Oct 02 10:55:22 crc systemd[1]: Starting Kubernetes Kubelet... Oct 02 10:55:22 crc restorecon[4734]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 02 10:55:22 
crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 
10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc 
restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 
crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 
crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 
10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:55:22 crc 
restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:55:22 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc 
restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:55:23 crc restorecon[4734]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 10:55:23 crc kubenswrapper[4835]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 10:55:23 crc kubenswrapper[4835]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 10:55:23 crc kubenswrapper[4835]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 10:55:23 crc kubenswrapper[4835]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
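Before the remaining kubelet flag warnings, a short aside on the restorecon records above: every pod directory under /var/lib/kubelet/pods carries an SELinux context of the form user:role:type:level (for example system_u:object_r:container_file_t:s0:c7,c13), and restorecon reports that it is leaving those contexts alone because they are customized. The following is a minimal, hypothetical Go sketch, not part of this log or of kubelet, for reading such a label off any of the paths listed above using the opencontainers go-selinux bindings; the default path is a placeholder.

    package main

    import (
        "fmt"
        "os"

        selinux "github.com/opencontainers/selinux/go-selinux"
    )

    func main() {
        // Placeholder default; pass any path from the log as the first argument.
        path := "/var/lib/kubelet/pods"
        if len(os.Args) > 1 {
            path = os.Args[1]
        }
        if !selinux.GetEnabled() {
            fmt.Fprintln(os.Stderr, "SELinux is not enabled on this host")
            os.Exit(1)
        }
        // FileLabel returns the user:role:type:level string for the path,
        // e.g. system_u:object_r:container_file_t:s0:c7,c13 for a pod directory.
        label, err := selinux.FileLabel(path)
        if err != nil {
            fmt.Fprintf(os.Stderr, "reading label of %s: %v\n", path, err)
            os.Exit(1)
        }
        fmt.Printf("%s %s\n", label, path)
    }

Running it on the node against one of the pod paths above would print the same context string that restorecon reports; the c-pair at the end is the per-pod MCS category set visible throughout the records above.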
Oct 02 10:55:23 crc kubenswrapper[4835]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 02 10:55:23 crc kubenswrapper[4835]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.974237 4835 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978548 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978587 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978593 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978630 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978638 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978644 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978651 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978659 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978667 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978673 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978681 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978686 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978692 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978698 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
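The "Flag ... has been deprecated" lines above all point at the same fix: move those settings into the file handed to the kubelet via --config. As a rough illustration only, with placeholder values not taken from this node, the sketch below builds an equivalent KubeletConfiguration using the published Go types and prints it as YAML; containerRuntimeEndpoint, volumePluginDir, registerWithTaints and systemReserved are the config-file counterparts of the deprecated flags named in the log.

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
        "sigs.k8s.io/yaml"
    )

    func main() {
        // Placeholder values; a real node would use its own runtime socket,
        // plugin directory, taints and reservations.
        cfg := kubeletv1beta1.KubeletConfiguration{
            TypeMeta: metav1.TypeMeta{
                APIVersion: "kubelet.config.k8s.io/v1beta1",
                Kind:       "KubeletConfiguration",
            },
            // replaces --container-runtime-endpoint
            ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock",
            // replaces --volume-plugin-dir
            VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
            // replaces --register-with-taints
            RegisterWithTaints: []corev1.Taint{
                {Key: "node-role.kubernetes.io/master", Effect: corev1.TaintEffectNoSchedule},
            },
            // replaces --system-reserved
            SystemReserved: map[string]string{"cpu": "500m", "memory": "1Gi"},
        }

        out, err := yaml.Marshal(&cfg)
        if err != nil {
            panic(err)
        }
        // The printed YAML is the shape of the file passed to --config.
        fmt.Print(string(out))
    }

The printed document is what would live at the path given to --config. There is no field shown here for --pod-infra-container-image, since per the message above the image garbage collector takes the sandbox image from the CRI runtime instead.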
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978705 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978710 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978715 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978720 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978725 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978731 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978736 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978741 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978746 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978751 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978763 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978768 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978773 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978777 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978782 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978786 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978791 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978796 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978801 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978806 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978812 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978817 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978821 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978825 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978829 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978834 4835 feature_gate.go:330] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978839 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978845 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978852 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978859 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978866 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978872 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978878 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978883 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978888 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978893 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978897 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978901 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978906 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978910 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978915 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978919 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978924 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978929 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978933 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978938 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978942 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978947 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978952 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978957 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978961 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 
10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978975 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978980 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978984 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978988 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978993 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.978997 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979129 4835 flags.go:64] FLAG: --address="0.0.0.0" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979145 4835 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979155 4835 flags.go:64] FLAG: --anonymous-auth="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979163 4835 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979171 4835 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979178 4835 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979187 4835 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979196 4835 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979202 4835 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979207 4835 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979213 4835 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979235 4835 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979242 4835 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979247 4835 flags.go:64] FLAG: --cgroup-root="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979252 4835 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979258 4835 flags.go:64] FLAG: --client-ca-file="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979264 4835 flags.go:64] FLAG: --cloud-config="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979269 4835 flags.go:64] FLAG: --cloud-provider="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979274 4835 flags.go:64] FLAG: --cluster-dns="[]" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979282 4835 flags.go:64] FLAG: --cluster-domain="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979287 4835 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979292 4835 flags.go:64] FLAG: --config-dir="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979297 4835 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" 
Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979303 4835 flags.go:64] FLAG: --container-log-max-files="5" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979311 4835 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979317 4835 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979323 4835 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979328 4835 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979334 4835 flags.go:64] FLAG: --contention-profiling="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979339 4835 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979344 4835 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979350 4835 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979356 4835 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979364 4835 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979370 4835 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979378 4835 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979384 4835 flags.go:64] FLAG: --enable-load-reader="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979389 4835 flags.go:64] FLAG: --enable-server="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979395 4835 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979402 4835 flags.go:64] FLAG: --event-burst="100" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979408 4835 flags.go:64] FLAG: --event-qps="50" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979413 4835 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979419 4835 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979425 4835 flags.go:64] FLAG: --eviction-hard="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979433 4835 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979438 4835 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979444 4835 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979449 4835 flags.go:64] FLAG: --eviction-soft="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979457 4835 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979462 4835 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979468 4835 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979473 4835 flags.go:64] FLAG: --experimental-mounter-path="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979479 4835 
flags.go:64] FLAG: --fail-cgroupv1="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979485 4835 flags.go:64] FLAG: --fail-swap-on="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979491 4835 flags.go:64] FLAG: --feature-gates="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979498 4835 flags.go:64] FLAG: --file-check-frequency="20s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979504 4835 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979510 4835 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979516 4835 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979521 4835 flags.go:64] FLAG: --healthz-port="10248" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979527 4835 flags.go:64] FLAG: --help="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979532 4835 flags.go:64] FLAG: --hostname-override="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979538 4835 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979543 4835 flags.go:64] FLAG: --http-check-frequency="20s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979549 4835 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979554 4835 flags.go:64] FLAG: --image-credential-provider-config="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979561 4835 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979567 4835 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979572 4835 flags.go:64] FLAG: --image-service-endpoint="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979578 4835 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979584 4835 flags.go:64] FLAG: --kube-api-burst="100" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979589 4835 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979595 4835 flags.go:64] FLAG: --kube-api-qps="50" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979601 4835 flags.go:64] FLAG: --kube-reserved="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979607 4835 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979612 4835 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979618 4835 flags.go:64] FLAG: --kubelet-cgroups="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979623 4835 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979629 4835 flags.go:64] FLAG: --lock-file="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979635 4835 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979641 4835 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979647 4835 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979655 4835 flags.go:64] FLAG: --log-json-split-stream="false" Oct 02 
10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979661 4835 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979666 4835 flags.go:64] FLAG: --log-text-split-stream="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979671 4835 flags.go:64] FLAG: --logging-format="text" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979677 4835 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979683 4835 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979688 4835 flags.go:64] FLAG: --manifest-url="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979693 4835 flags.go:64] FLAG: --manifest-url-header="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979701 4835 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979707 4835 flags.go:64] FLAG: --max-open-files="1000000" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979714 4835 flags.go:64] FLAG: --max-pods="110" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979719 4835 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979724 4835 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979730 4835 flags.go:64] FLAG: --memory-manager-policy="None" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979736 4835 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979741 4835 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979749 4835 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979755 4835 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979769 4835 flags.go:64] FLAG: --node-status-max-images="50" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979775 4835 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979780 4835 flags.go:64] FLAG: --oom-score-adj="-999" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979786 4835 flags.go:64] FLAG: --pod-cidr="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979791 4835 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979802 4835 flags.go:64] FLAG: --pod-manifest-path="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979807 4835 flags.go:64] FLAG: --pod-max-pids="-1" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979813 4835 flags.go:64] FLAG: --pods-per-core="0" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979818 4835 flags.go:64] FLAG: --port="10250" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979824 4835 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979829 4835 flags.go:64] FLAG: --provider-id="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979835 4835 flags.go:64] FLAG: --qos-reserved="" Oct 02 
10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979840 4835 flags.go:64] FLAG: --read-only-port="10255" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979845 4835 flags.go:64] FLAG: --register-node="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979851 4835 flags.go:64] FLAG: --register-schedulable="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979862 4835 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979872 4835 flags.go:64] FLAG: --registry-burst="10" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979877 4835 flags.go:64] FLAG: --registry-qps="5" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979882 4835 flags.go:64] FLAG: --reserved-cpus="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979887 4835 flags.go:64] FLAG: --reserved-memory="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979895 4835 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979901 4835 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979906 4835 flags.go:64] FLAG: --rotate-certificates="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979912 4835 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979919 4835 flags.go:64] FLAG: --runonce="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979924 4835 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979930 4835 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979936 4835 flags.go:64] FLAG: --seccomp-default="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979941 4835 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979946 4835 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979952 4835 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979960 4835 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979965 4835 flags.go:64] FLAG: --storage-driver-password="root" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979971 4835 flags.go:64] FLAG: --storage-driver-secure="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979977 4835 flags.go:64] FLAG: --storage-driver-table="stats" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979983 4835 flags.go:64] FLAG: --storage-driver-user="root" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979988 4835 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.979994 4835 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980000 4835 flags.go:64] FLAG: --system-cgroups="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980006 4835 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980016 4835 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980022 4835 flags.go:64] FLAG: --tls-cert-file="" 
Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980028 4835 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980036 4835 flags.go:64] FLAG: --tls-min-version="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980042 4835 flags.go:64] FLAG: --tls-private-key-file="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980048 4835 flags.go:64] FLAG: --topology-manager-policy="none" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980054 4835 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980060 4835 flags.go:64] FLAG: --topology-manager-scope="container" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980066 4835 flags.go:64] FLAG: --v="2" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980075 4835 flags.go:64] FLAG: --version="false" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980083 4835 flags.go:64] FLAG: --vmodule="" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980091 4835 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980098 4835 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980262 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980270 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980277 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980281 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980286 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980293 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980299 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980304 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980310 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980316 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980327 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980333 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980337 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980343 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
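Annotation: the long run of flags.go:64 "FLAG: --name=value" lines above is the kubelet echoing every parsed command-line flag at verbosity --v=2. A minimal sketch of that pattern follows, using only the Go standard library flag package (the real kubelet uses its own flag machinery, so this illustrates the technique rather than reproducing its code); the two sample flags and their values are taken from the dump above.

package main

import (
	"flag"
	"fmt"
)

func main() {
	// Two sample flags standing in for the kubelet's much larger set.
	flag.String("node-ip", "192.168.126.11", "node IP address")
	flag.Int("max-pods", 110, "maximum number of pods")
	flag.Parse()

	// Echo every registered flag and its current value, mirroring the
	// "FLAG: --name=value" lines in the log above.
	flag.VisitAll(func(f *flag.Flag) {
		fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
	})
}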
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980349 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980354 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980359 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980364 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980369 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980375 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980381 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980386 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980392 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980397 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980402 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980407 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980412 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980417 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980422 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980427 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980432 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980436 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980441 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980446 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980453 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980458 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980464 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980470 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980475 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980480 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980485 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980490 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980497 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980501 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980506 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980512 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980518 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980522 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980528 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980533 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980539 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980544 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980549 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980554 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980559 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980564 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980569 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980574 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980579 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980583 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980588 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980592 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980598 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 
10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980602 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980607 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980611 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980616 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980620 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980625 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980629 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.980637 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.980653 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.993217 4835 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.993309 4835 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993485 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993511 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993525 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993538 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993547 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993556 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993564 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993572 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993581 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993589 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993599 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
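Annotation: the repeated warnings above are the kubelet checking each gate named in its configuration against the set it actually ships with. OpenShift-specific gates such as GCPLabelsTags or RouteAdvertisements are unknown to the upstream kubelet and are skipped with "unrecognized feature gate" (feature_gate.go:330), while recognized ones (KMSv1, CloudDualStackNodeIPs, ValidatingAdmissionPolicy, ...) are applied and the resulting map is printed at feature_gate.go:386. Below is a minimal sketch of that parse-and-filter step, assuming a simple comma-separated "Name=bool" input and a tiny hard-coded known set rather than the kubelet's real featuregate package.

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// known stands in for the gates the kubelet ships with; the real set is much
// larger and versioned per release.
var known = map[string]bool{
	"KMSv1":                     true,
	"CloudDualStackNodeIPs":     true,
	"ValidatingAdmissionPolicy": true,
}

// parseGates filters a "Name=bool,Name=bool" list down to recognized gates,
// warning on the rest, roughly like the feature_gate.go:330/386 lines above.
func parseGates(spec string) map[string]bool {
	gates := map[string]bool{}
	for _, kv := range strings.Split(spec, ",") {
		name, val, ok := strings.Cut(kv, "=")
		if !ok {
			continue
		}
		if !known[name] {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		enabled, err := strconv.ParseBool(val)
		if err != nil {
			continue
		}
		gates[name] = enabled
	}
	return gates
}

func main() {
	fmt.Println("feature gates:", parseGates("KMSv1=true,GCPLabelsTags=true,CloudDualStackNodeIPs=true"))
}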
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993610 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993619 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993629 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993640 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993648 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993657 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993665 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993673 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993681 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993689 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993696 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993704 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993712 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993719 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993727 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993735 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993743 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993751 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993759 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993767 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993775 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993782 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993789 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993799 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993807 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993816 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 
10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993823 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993831 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993840 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993848 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993856 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993863 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993871 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993879 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993886 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993896 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993904 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993911 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993920 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993927 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993935 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993943 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993950 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993958 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993967 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993976 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993985 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.993994 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994008 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994018 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994026 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994034 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994042 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994050 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994058 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994066 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994073 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994081 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994091 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994103 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.994116 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994367 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994382 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994391 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994402 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994410 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994418 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994426 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994434 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994443 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994451 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 
10:55:23.994458 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994466 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994474 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994482 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994490 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994498 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994505 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994513 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994521 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994530 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994538 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994546 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994553 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994561 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994570 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994580 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994590 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994599 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994608 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994617 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994626 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994637 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994646 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994657 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994703 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994714 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994722 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994730 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994738 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994747 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994755 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994763 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994770 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994778 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994786 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994794 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994801 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994809 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994819 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994828 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994836 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994844 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994852 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994861 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994869 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994877 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994885 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994892 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994900 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994908 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994916 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994924 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994932 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994940 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994947 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994955 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994963 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994974 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994985 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.994994 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:55:23 crc kubenswrapper[4835]: W1002 10:55:23.995004 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.995020 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 10:55:23 crc kubenswrapper[4835]: I1002 10:55:23.996321 4835 server.go:940] "Client rotation is on, will bootstrap in background" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.002481 4835 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.002624 4835 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.004807 4835 server.go:997] "Starting client certificate rotation" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.004856 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.005202 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-29 22:16:38.550098612 +0000 UTC Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.005328 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1403h21m14.544775466s for next certificate rotation Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.041552 4835 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.047359 4835 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.068269 4835 log.go:25] "Validated CRI v1 runtime API" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.108697 4835 log.go:25] "Validated CRI v1 image API" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.111190 4835 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.117444 4835 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-10-50-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.117492 4835 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 
fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.151560 4835 manager.go:217] Machine: {Timestamp:2025-10-02 10:55:24.147179146 +0000 UTC m=+0.707086807 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0e54c7ff-993a-4ab6-9817-7e5a943ad8d7 BootID:9e30b685-d777-4f6c-84ea-b9aec6204c89 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:24:95:9a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:24:95:9a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a0:d8:d9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7a:d9:e9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:72:65:15 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:17:61:24 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:98:8e:d5 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:bb:57:24:90:39 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:5f:f4:fd:d2:52 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.152030 4835 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
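
The feature_gate.go:330 warnings logged before the machine inventory above appear to be OpenShift-level gate names that the kubelet's embedded Kubernetes feature-gate registry does not know, so they are warned about and skipped; only the map printed by feature_gate.go:386 is actually applied. A minimal sketch for summarizing that, assuming this journal has been saved to a file named kubelet.journal (a hypothetical name, e.g. via journalctl -u kubelet > kubelet.journal):

    import re
    from pathlib import Path

    # Hypothetical capture of the journal shown in this document.
    text = Path("kubelet.journal").read_text()

    # Gates the kubelet's feature-gate registry rejected (feature_gate.go:330).
    unrecognized = sorted(set(re.findall(r"unrecognized feature gate: (\S+)", text)))
    print(f"{len(unrecognized)} unrecognized gates:", ", ".join(unrecognized))

    # The map the kubelet actually applied (feature_gate.go:386).
    applied = re.search(r"feature gates: \{map\[([^\]]*)\]\}", text)
    if applied:
        for pair in applied.group(1).split():
            name, _, value = pair.partition(":")
            print(f"{name} = {value}")

The regexes here are illustrative and only match the message formats visible in this log.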
Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.152336 4835 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.154967 4835 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.155452 4835 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.155529 4835 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.155904 4835 topology_manager.go:138] "Creating topology manager with none policy" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.155928 4835 container_manager_linux.go:303] "Creating device plugin manager" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.156570 4835 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.157682 4835 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.158090 4835 state_mem.go:36] "Initialized new in-memory state store" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.158742 4835 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.164342 4835 kubelet.go:418] "Attempting to sync node with API server" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.164383 4835 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
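
The container_manager_linux.go:272 entry just above embeds the full node config as JSON, including SystemReserved and the HardEvictionThresholds the kubelet will enforce. A minimal sketch, again assuming the journal has been saved to the hypothetical kubelet.journal, that decodes that JSON in place and prints the eviction signals:

    import json
    from pathlib import Path

    text = Path("kubelet.journal").read_text()              # hypothetical capture of this journal
    start = text.index("nodeConfig=") + len("nodeConfig=")

    # raw_decode parses one JSON value and ignores whatever log text follows it.
    node_config, _ = json.JSONDecoder().raw_decode(text, start)

    print("SystemReserved:", node_config["SystemReserved"])
    for t in node_config["HardEvictionThresholds"]:
        value = t["Value"]["Quantity"] or f'{t["Value"]["Percentage"]:.0%}'
        print(f'{t["Signal"]} {t["Operator"]} {value}')

Run against the entry above, this would report imagefs.inodesFree < 5%, memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, and imagefs.available < 15%.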
Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.164430 4835 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.164467 4835 kubelet.go:324] "Adding apiserver pod source" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.164491 4835 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.174022 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.174186 4835 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.174390 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.174019 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.174507 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.175497 4835 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
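
Both reflector errors above fail the same way: the client dials https://api-int.crc.testing:6443, which resolves to 38.102.83.169, and the TCP connection is refused, presumably because the static-pod kube-apiserver is not up yet this early in boot. A minimal sketch to reproduce that check from the node (host and port copied from the log; nothing here is kubelet-specific):

    import socket

    # Endpoint taken from the reflector errors above.
    host, port = "api-int.crc.testing", 6443

    try:
        addr = socket.gethostbyname(host)           # 38.102.83.169 per the log
        with socket.create_connection((addr, port), timeout=3):
            print(f"{host} ({addr}):{port} is accepting connections")
    except OSError as err:                          # ConnectionRefusedError, timeout, DNS failure, ...
        print(f"{host}:{port} unreachable: {err}")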
Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.178595 4835 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180471 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180518 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180534 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180548 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180572 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180586 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180600 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180627 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180644 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180659 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180681 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180696 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.180743 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.182006 4835 server.go:1280] "Started kubelet" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.182413 4835 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.184401 4835 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 02 10:55:24 crc systemd[1]: Started Kubernetes Kubelet. 
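
The certificate_manager.go entries report an expiration, a rotation deadline, and a wait duration; the wait is simply the gap between the deadline and the moment the message was logged. A quick check of the kube-apiserver-client-kubelet numbers logged earlier (timestamps copied from the log):

    from datetime import datetime, timezone

    # Timestamps copied from the kube-apiserver-client-kubelet entries above.
    logged_at = datetime(2025, 10, 2, 10, 55, 24, 5328, tzinfo=timezone.utc)     # I1002 10:55:24.005328
    deadline  = datetime(2025, 11, 29, 22, 16, 38, 550099, tzinfo=timezone.utc)  # rotation deadline

    wait = deadline - logged_at
    hours, rest = divmod(int(wait.total_seconds()), 3600)
    minutes, seconds = divmod(rest, 60)
    print(f"{hours}h{minutes}m{seconds}s")   # 1403h21m14s, matching "Waiting 1403h21m14.544775466s"

The kubelet-serving pair a few lines below follows the same arithmetic, giving its logged 1089h15m45s wait.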
Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.186917 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.188479 4835 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.192447 4835 server.go:460] "Adding debug handlers to kubelet server" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.196295 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.196481 4835 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.198933 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 20:11:09.252158956 +0000 UTC Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.199198 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1089h15m45.052969184s for next certificate rotation Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.199556 4835 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.199698 4835 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.200321 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.201295 4835 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.201310 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="200ms" Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.201703 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.201852 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.202685 4835 factory.go:55] Registering systemd factory Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.202864 4835 factory.go:221] Registration of the systemd container factory successfully Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.203535 4835 factory.go:153] Registering CRI-O factory Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.203576 4835 factory.go:221] Registration of the crio container factory successfully Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.203715 4835 factory.go:219] 
Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.202096 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aa74626ca26f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 10:55:24.181931766 +0000 UTC m=+0.741839387,LastTimestamp:2025-10-02 10:55:24.181931766 +0000 UTC m=+0.741839387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.203765 4835 factory.go:103] Registering Raw factory Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.203808 4835 manager.go:1196] Started watching for new ooms in manager Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.205792 4835 manager.go:319] Starting recovery of all containers Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221560 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221658 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221690 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221714 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221740 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221766 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221789 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221813 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221841 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221864 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221881 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221902 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221919 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221940 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221967 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.221991 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222019 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222041 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222069 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222097 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222122 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222208 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222282 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222308 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222338 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222370 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222402 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222432 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222460 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222486 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222515 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222541 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222573 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222598 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222624 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222653 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222737 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222763 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.222793 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.225917 4835 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.225993 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226031 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226066 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226094 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226139 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226169 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226199 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226377 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226409 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226435 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226464 4835 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226491 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226524 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226558 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226592 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226619 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226645 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226673 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226706 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226736 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226765 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226792 4835 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226818 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226843 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226872 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226898 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226922 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226947 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226972 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.226998 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227022 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227050 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227079 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227106 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227133 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227161 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227187 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.227214 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228240 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228304 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228355 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228374 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228462 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228483 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228506 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228535 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228557 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228576 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228607 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228631 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228658 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228677 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228697 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228725 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228744 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.228767 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229115 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229228 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229253 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229279 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229311 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229333 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229360 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229386 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229410 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229465 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229503 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229569 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229599 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229632 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229655 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229689 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229719 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229741 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229764 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229785 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229810 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229828 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229853 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229871 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229885 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229910 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229932 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229956 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229971 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.229988 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230010 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230026 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230050 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230066 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230084 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230107 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230125 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230151 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230171 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230191 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230212 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230279 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230307 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230328 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230347 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230371 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230387 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230403 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230426 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230444 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230466 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230483 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230500 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230519 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230534 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230553 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230566 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230580 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230597 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230613 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230633 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230650 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230664 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230683 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230697 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230716 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230731 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230748 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230767 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230782 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230801 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230817 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230837 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230855 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230872 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230893 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230908 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230922 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230948 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230973 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.230994 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231011 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231024 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231042 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231055 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231075 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231091 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231106 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231124 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231140 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231157 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231174 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231188 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231208 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231240 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231259 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231275 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231289 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231305 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231323 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231343 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231358 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231372 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231391 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231405 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231420 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231446 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231462 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231483 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231498 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231514 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231532 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231546 4835 reconstruct.go:97] "Volume reconstruction finished" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.231558 4835 reconciler.go:26] "Reconciler: start to sync state" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.242399 4835 manager.go:324] Recovery completed Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.246190 4835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.250099 4835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.250331 4835 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.250479 4835 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.250681 4835 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.252106 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.252416 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.261510 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.263605 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.263670 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 
10:55:24.263691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.265289 4835 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.265318 4835 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.265341 4835 state_mem.go:36] "Initialized new in-memory state store" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.293361 4835 policy_none.go:49] "None policy: Start" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.294828 4835 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.294856 4835 state_mem.go:35] "Initializing new in-memory state store" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.300484 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.348092 4835 manager.go:334] "Starting Device Plugin manager" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.348218 4835 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.348298 4835 server.go:79] "Starting device plugin registration server" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.349147 4835 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.349187 4835 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.349520 4835 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.349741 4835 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.349752 4835 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.351561 4835 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.351650 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.353070 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.353109 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.353124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.353295 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.353715 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.353793 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.354621 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.354668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.354684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.354910 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.355029 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.355073 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.355178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.355204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.355231 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.356305 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.356663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.356688 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.356705 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.356725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.356734 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.356850 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.357060 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.357103 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.358678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.358710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.358722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.358839 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.359182 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.359294 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.360543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.360584 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.360599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.360987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.361024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.361035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.361053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.361066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.361051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.361432 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.361500 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.363466 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.363529 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.363581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.363597 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.402300 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="400ms" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433560 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433613 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433659 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433676 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433718 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433752 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433778 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433802 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433827 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433848 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433870 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433892 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433913 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.433939 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc 
kubenswrapper[4835]: I1002 10:55:24.449697 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.451094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.451139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.451149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.451184 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.451835 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535687 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535776 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535810 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535829 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535845 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535860 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535907 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535906 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535919 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535932 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535981 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.535923 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536003 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536053 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536069 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536089 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536068 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536103 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536118 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536150 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536160 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536193 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536257 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536263 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536211 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.536269 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.652086 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.654164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.654193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.654202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.654245 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.654473 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.687066 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.698599 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.733134 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.740922 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c13e34ead83501b1aafcf1364d4e1eaa53af043f17394ddfe5097ff17905752c WatchSource:0}: Error finding container c13e34ead83501b1aafcf1364d4e1eaa53af043f17394ddfe5097ff17905752c: Status 404 returned error can't find the container with id c13e34ead83501b1aafcf1364d4e1eaa53af043f17394ddfe5097ff17905752c Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.757990 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-60348bfdf31ca837d1646b38b1f05766c33154a9117847081075257dd164ddc1 WatchSource:0}: Error finding container 60348bfdf31ca837d1646b38b1f05766c33154a9117847081075257dd164ddc1: Status 404 returned error can't find the container with id 60348bfdf31ca837d1646b38b1f05766c33154a9117847081075257dd164ddc1 Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.758265 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: I1002 10:55:24.768208 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.777317 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a100c6f6c34e4952fd835381bba3e8646211456ec659dbfde520594279961103 WatchSource:0}: Error finding container a100c6f6c34e4952fd835381bba3e8646211456ec659dbfde520594279961103: Status 404 returned error can't find the container with id a100c6f6c34e4952fd835381bba3e8646211456ec659dbfde520594279961103 Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.796126 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1a35e629314d17427f91ef47302e31a883df770d6c0b59f723301f867cae105e WatchSource:0}: Error finding container 1a35e629314d17427f91ef47302e31a883df770d6c0b59f723301f867cae105e: Status 404 returned error can't find the container with id 1a35e629314d17427f91ef47302e31a883df770d6c0b59f723301f867cae105e Oct 02 10:55:24 crc kubenswrapper[4835]: W1002 10:55:24.802671 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6fd0228ce2bcf99120a15404edeeea2e34167373c5137b8be646c6dea4e0c4c2 WatchSource:0}: Error finding container 6fd0228ce2bcf99120a15404edeeea2e34167373c5137b8be646c6dea4e0c4c2: Status 404 returned error can't find the container with id 6fd0228ce2bcf99120a15404edeeea2e34167373c5137b8be646c6dea4e0c4c2 Oct 02 10:55:24 crc kubenswrapper[4835]: E1002 10:55:24.803635 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="800ms" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.055274 4835 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.056653 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.056714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.056728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.056778 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.057300 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.188810 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:25 crc kubenswrapper[4835]: W1002 10:55:25.239942 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.240046 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.260186 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a100c6f6c34e4952fd835381bba3e8646211456ec659dbfde520594279961103"} Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.261746 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"60348bfdf31ca837d1646b38b1f05766c33154a9117847081075257dd164ddc1"} Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.264031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c13e34ead83501b1aafcf1364d4e1eaa53af043f17394ddfe5097ff17905752c"} Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.266007 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6fd0228ce2bcf99120a15404edeeea2e34167373c5137b8be646c6dea4e0c4c2"} Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.267083 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1a35e629314d17427f91ef47302e31a883df770d6c0b59f723301f867cae105e"} Oct 02 10:55:25 crc kubenswrapper[4835]: W1002 10:55:25.347158 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.347280 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.468195 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aa74626ca26f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 10:55:24.181931766 +0000 UTC m=+0.741839387,LastTimestamp:2025-10-02 10:55:24.181931766 +0000 UTC m=+0.741839387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.604442 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="1.6s" Oct 02 10:55:25 crc kubenswrapper[4835]: W1002 10:55:25.788799 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.788919 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:25 crc kubenswrapper[4835]: W1002 10:55:25.838921 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.839024 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:25 crc 
kubenswrapper[4835]: I1002 10:55:25.858420 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.861381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.861463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.861482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:25 crc kubenswrapper[4835]: I1002 10:55:25.861521 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:55:25 crc kubenswrapper[4835]: E1002 10:55:25.862151 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.188417 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.271774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.271827 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.271838 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.271846 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.271934 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.273105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.273161 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.273178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.274437 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492" exitCode=0 Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.274523 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.274645 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.276207 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.276258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.276269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.277117 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="02d4e31ef3f58036a2fe6c09f70020f137b6b9bbabcd3d9e6ecb39e7b38bc380" exitCode=0 Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.277197 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"02d4e31ef3f58036a2fe6c09f70020f137b6b9bbabcd3d9e6ecb39e7b38bc380"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.277208 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.277802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.277824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.277833 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.277993 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.279497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.279524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.279532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.280588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"962f1b88475c1769f073e54459d255dfacd841366c065f3263bb203e1bbbca47"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.280632 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.280605 4835 
generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="962f1b88475c1769f073e54459d255dfacd841366c065f3263bb203e1bbbca47" exitCode=0 Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.282189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.282366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.282440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.283821 4835 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516" exitCode=0 Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.283854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516"} Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.283889 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.284613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.284679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.284691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.772195 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:26 crc kubenswrapper[4835]: I1002 10:55:26.968604 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:27 crc kubenswrapper[4835]: W1002 10:55:27.086813 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:27 crc kubenswrapper[4835]: E1002 10:55:27.086947 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:27 crc kubenswrapper[4835]: W1002 10:55:27.116372 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:27 crc kubenswrapper[4835]: E1002 10:55:27.116480 4835 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.189014 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Oct 02 10:55:27 crc kubenswrapper[4835]: E1002 10:55:27.205395 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="3.2s" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.297613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.297679 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.297698 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.297717 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.300335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.300427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.300442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.305672 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.305709 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.305730 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4"} 
Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.305747 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.308052 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ba6920ca4b62b0e37fe1fa3b7ed75f509a4dbecc41e23ea52c1366a3b3f82f9b" exitCode=0 Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.308105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ba6920ca4b62b0e37fe1fa3b7ed75f509a4dbecc41e23ea52c1366a3b3f82f9b"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.308275 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.309359 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.309388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.309400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.314582 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"581ca8c1ec33b426223ce2b125e5e8995a3258a92c1ceaca6a2a1333bb35a164"} Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.314608 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.314660 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.315712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.315753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.315763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.316418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.316451 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.316468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.463271 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.464719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.464746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.464756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:27 crc kubenswrapper[4835]: I1002 10:55:27.464780 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:55:27 crc kubenswrapper[4835]: E1002 10:55:27.465244 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.204124 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.322553 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13"} Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.322749 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.324569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.324614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.324625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.324784 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="54b5275fb5b1530b809ab177485fd261602e5008be9c28381f72fac7ef0fdc85" exitCode=0 Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.324894 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.324873 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"54b5275fb5b1530b809ab177485fd261602e5008be9c28381f72fac7ef0fdc85"} Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.324938 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.325063 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.325713 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.325776 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.325861 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.326719 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.326750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.326760 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.326768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.326798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.326809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.328074 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.328101 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.328122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.328135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.328108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.328281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:28 crc kubenswrapper[4835]: I1002 10:55:28.468621 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.332594 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7ed698d93bdcaf46bed5745e06f66ebef02a88d50802e5f7a304b4dba7b31a9d"} Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.332718 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"925933106ef051b2fe7dbd61758717f24d3f8370778fcfec65373884e7b09862"} Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.332744 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"09de897d1ce9009170b27f1e9d924acc713deeb1c11be5b19c573ba4df2d255f"} Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.332763 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c1f781dffd3ee8eb8211c6663574642acd7de8cfe14b99e1c63d65d4c7c19f14"} Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.332662 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 
10:55:29.332787 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.332864 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.332877 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.335077 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.335127 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.335192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.335214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.335273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.335299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.658658 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.658990 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.660834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.660902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:29 crc kubenswrapper[4835]: I1002 10:55:29.660919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.340491 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0938eca70f2e31639a8168c94539e7a2ef5bacbc77795d117ce0392d9a4b52e5"} Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.340519 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.340608 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.341014 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.342441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.342485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.342498 4835 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.343461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.343510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.343527 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.665827 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.668269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.668331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.668350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:30 crc kubenswrapper[4835]: I1002 10:55:30.668388 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.204651 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.204825 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.343908 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.345432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.345504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.345519 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.541883 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.542127 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.543570 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.543606 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:31 crc kubenswrapper[4835]: I1002 10:55:31.543615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.050694 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.051043 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.052799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.052864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.052882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.184312 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.346693 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.347821 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.347852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.347863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.988582 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.988855 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.990577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.990622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:32 crc kubenswrapper[4835]: I1002 10:55:32.990636 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:33 crc kubenswrapper[4835]: I1002 10:55:33.301057 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:33 crc kubenswrapper[4835]: I1002 10:55:33.301548 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:33 crc kubenswrapper[4835]: I1002 10:55:33.303752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:33 crc kubenswrapper[4835]: I1002 10:55:33.303830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:55:33 crc kubenswrapper[4835]: I1002 10:55:33.303839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:34 crc kubenswrapper[4835]: E1002 10:55:34.363741 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 10:55:35 crc kubenswrapper[4835]: I1002 10:55:35.157585 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 02 10:55:35 crc kubenswrapper[4835]: I1002 10:55:35.157897 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:35 crc kubenswrapper[4835]: I1002 10:55:35.160319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:35 crc kubenswrapper[4835]: I1002 10:55:35.160408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:35 crc kubenswrapper[4835]: I1002 10:55:35.160458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:37 crc kubenswrapper[4835]: W1002 10:55:37.938785 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 02 10:55:37 crc kubenswrapper[4835]: I1002 10:55:37.938911 4835 trace.go:236] Trace[171204305]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:55:27.936) (total time: 10001ms): Oct 02 10:55:37 crc kubenswrapper[4835]: Trace[171204305]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:55:37.938) Oct 02 10:55:37 crc kubenswrapper[4835]: Trace[171204305]: [10.001876422s] [10.001876422s] END Oct 02 10:55:37 crc kubenswrapper[4835]: E1002 10:55:37.938945 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.189143 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 02 10:55:38 crc kubenswrapper[4835]: W1002 10:55:38.258881 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.259032 4835 trace.go:236] Trace[1145152825]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:55:28.257) (total time: 10001ms): Oct 02 10:55:38 crc kubenswrapper[4835]: Trace[1145152825]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:55:38.258) Oct 02 10:55:38 crc kubenswrapper[4835]: 
Trace[1145152825]: [10.001430826s] [10.001430826s] END Oct 02 10:55:38 crc kubenswrapper[4835]: E1002 10:55:38.259061 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.365191 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.367573 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13" exitCode=255 Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.367605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13"} Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.367839 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.368919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.368957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.368968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:38 crc kubenswrapper[4835]: I1002 10:55:38.369562 4835 scope.go:117] "RemoveContainer" containerID="730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13" Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.004643 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.004721 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.009962 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.010050 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.372013 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.374434 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef"} Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.374772 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.376389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.376419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:39 crc kubenswrapper[4835]: I1002 10:55:39.376431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:41 crc kubenswrapper[4835]: I1002 10:55:41.205396 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 10:55:41 crc kubenswrapper[4835]: I1002 10:55:41.206109 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 10:55:41 crc kubenswrapper[4835]: I1002 10:55:41.548562 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:41 crc kubenswrapper[4835]: I1002 10:55:41.548774 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:41 crc kubenswrapper[4835]: I1002 10:55:41.550649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:41 crc kubenswrapper[4835]: I1002 10:55:41.550863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:41 crc kubenswrapper[4835]: I1002 10:55:41.551040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.041966 4835 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.051508 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.051776 
4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.053545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.053596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.053610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.213292 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.213531 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.215082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.215143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.215158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.231936 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.382512 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.383929 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.383987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:42 crc kubenswrapper[4835]: I1002 10:55:42.384001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.081666 4835 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.309027 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.309293 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.310858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.310911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.310930 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.314528 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:43 
crc kubenswrapper[4835]: I1002 10:55:43.384621 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.385569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.385614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:43 crc kubenswrapper[4835]: I1002 10:55:43.385629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:43 crc kubenswrapper[4835]: E1002 10:55:43.999203 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.003745 4835 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.003810 4835 trace.go:236] Trace[982644749]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:55:30.329) (total time: 13673ms): Oct 02 10:55:44 crc kubenswrapper[4835]: Trace[982644749]: ---"Objects listed" error: 13673ms (10:55:44.003) Oct 02 10:55:44 crc kubenswrapper[4835]: Trace[982644749]: [13.673938045s] [13.673938045s] END Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.003839 4835 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.006703 4835 trace.go:236] Trace[249068924]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:55:30.368) (total time: 13638ms): Oct 02 10:55:44 crc kubenswrapper[4835]: Trace[249068924]: ---"Objects listed" error: 13638ms (10:55:44.006) Oct 02 10:55:44 crc kubenswrapper[4835]: Trace[249068924]: [13.638341589s] [13.638341589s] END Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.006764 4835 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.011147 4835 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.011509 4835 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.012843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.012870 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.012880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.012901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.012912 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"[container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.026621 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.031394 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.031448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 
10:55:44.031460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.031488 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.031502 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.041154 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.051339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.051412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 
10:55:44.051432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.051459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.051473 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.067885 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.072893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.072947 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 
10:55:44.072962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.072988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.073001 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.082286 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.086833 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.086884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 
10:55:44.086898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.086926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.086939 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.098269 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.098411 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.100837 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 
10:55:44.100887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.100901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.100925 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.100942 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.177581 4835 apiserver.go:52] "Watching apiserver" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.181295 4835 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.181619 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.182052 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.182112 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.182160 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.182262 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.182366 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.182587 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.182610 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.182619 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.182664 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.184550 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.184613 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.184646 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.184881 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.184993 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.185074 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.185536 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.187918 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.187996 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.201881 4835 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.203321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.203371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.203387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.203410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 
10:55:44.203427 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204715 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204764 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204785 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204824 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204843 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204862 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204883 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204927 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204950 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204969 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.204988 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205004 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205021 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205054 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205107 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205126 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205200 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205265 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205378 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205396 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205414 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205432 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205449 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205471 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205497 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205523 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205539 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205534 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.205574 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207166 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207165 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207042 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207204 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207337 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207352 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207376 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207433 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207939 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.207989 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.208016 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.208771 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.209020 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.209264 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.209652 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.209872 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.210046 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.210297 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.210582 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.211512 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.211590 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.211647 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.211997 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217062 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217426 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217416 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217568 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217637 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217709 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217722 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217732 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217768 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217980 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.217983 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218020 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218165 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218290 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218328 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218351 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218437 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218517 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218551 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218575 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218596 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218621 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.218646 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.231763 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234487 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234613 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234651 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234675 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234702 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234724 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234748 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234772 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234792 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234813 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234831 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234850 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234871 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234893 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234917 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234934 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234952 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234973 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235006 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235024 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235045 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235067 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235088 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235111 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235132 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235159 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235204 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235240 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 10:55:44 
crc kubenswrapper[4835]: I1002 10:55:44.235283 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235370 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235392 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235416 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235437 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235456 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235477 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235517 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235540 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235559 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235579 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235605 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235665 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235688 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235710 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235731 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235750 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235769 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235789 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235806 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235825 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235845 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235886 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235907 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235927 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235947 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235967 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236000 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236020 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236041 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236061 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236086 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236104 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236123 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236142 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236166 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236187 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236232 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236253 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236312 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236330 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236347 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236364 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236382 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236400 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236424 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") 
pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236442 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236460 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236494 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236519 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236538 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236558 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236577 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236612 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.234643 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235576 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.235895 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236159 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.267963 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236383 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236491 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236615 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236846 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237010 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237012 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237156 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237190 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237308 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237638 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237682 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.237769 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:55:44.737741928 +0000 UTC m=+21.297649509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237836 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237881 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237892 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.222313 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.237915 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.238078 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.240179 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.255214 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.257610 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.257770 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.257905 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.258133 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.258577 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.258918 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.229455 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.259054 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.259129 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.259245 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.261242 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.261709 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.262000 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.262270 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.262287 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.265523 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.266060 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.266061 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.266363 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.267701 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.267749 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.267800 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.268193 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.268354 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). 
InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.268601 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.268753 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.269574 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.269620 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.270039 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.270122 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.270449 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.270416 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.270690 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.270806 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.270914 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.271118 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.271112 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.271631 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.271963 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.272078 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.272422 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.272463 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.272505 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.272529 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.273134 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.273464 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.273630 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.274189 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.274328 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.274557 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.275813 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.276010 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.276455 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.276753 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.276808 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.277104 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.277176 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.277260 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.236633 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.278214 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.278265 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.278291 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.278312 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.278337 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.279563 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.279650 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.279722 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280400 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.279961 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280429 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280444 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280467 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280492 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280536 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.279978 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280192 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280741 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.280940 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.281070 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.281330 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.282542 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.282677 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.282838 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283198 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283491 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283570 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283608 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283630 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283651 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283670 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283687 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283705 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283723 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283741 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283759 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283778 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283800 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283824 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283841 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283860 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283879 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.283898 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284619 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284745 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284783 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284898 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285052 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285094 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285142 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285172 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285203 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285250 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") 
pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285281 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285308 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285366 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285415 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285459 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285503 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285535 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285561 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285588 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285665 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285658 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285870 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285916 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285946 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286144 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286194 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286256 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286296 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284515 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284516 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284851 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.284869 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285401 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285603 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285809 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.285980 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286200 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286704 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286713 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.286908 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.287262 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.287430 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.287472 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.287544 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.287787 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.287814 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.287865 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288034 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288247 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288504 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288520 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288559 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288783 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288810 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.288954 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.289050 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.289153 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.289189 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.289352 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.289380 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.292648 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.292811 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.292848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.292894 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.292902 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.292921 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.293111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.293149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.293173 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.293250 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.293280 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.293425 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.293488 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:44.79346693 +0000 UTC m=+21.353374511 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.293616 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.293910 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.294077 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.294139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.294210 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.294350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.294351 4835 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.294417 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.294926 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.294750 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295117 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295139 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295352 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295496 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295531 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.295807 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:44.795772766 +0000 UTC m=+21.355680347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295842 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295854 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.296081 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.295930 4835 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.296142 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.296989 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297280 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297311 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297360 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297376 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297390 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297404 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297445 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297459 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297472 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297485 4835 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297520 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297550 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297618 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297636 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297647 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297657 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297203 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297668 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297722 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297761 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297776 4835 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297790 4835 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297796 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: 
"925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297806 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297845 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297858 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297871 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297883 4835 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297918 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297932 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297947 4835 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297961 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.297994 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298009 4835 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298021 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298034 4835 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node 
\"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298045 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298080 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298094 4835 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298106 4835 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298118 4835 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298153 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298168 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298181 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298179 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298194 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298330 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298354 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298368 4835 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298427 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298482 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298494 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298509 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298524 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298537 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298549 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298560 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298574 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298595 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298624 4835 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298638 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298660 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298719 4835 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298734 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298792 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298809 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298834 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298856 4835 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298869 4835 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298881 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298894 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" 
(UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298907 4835 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298919 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298931 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298939 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298921 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298971 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.298946 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299024 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299042 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299055 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299071 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299085 4835 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299098 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299113 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299125 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299138 4835 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299151 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299163 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299176 4835 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299189 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299204 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299390 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299408 4835 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299432 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299446 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299459 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299474 4835 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299489 4835 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299506 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299519 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299531 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299551 4835 reconciler_common.go:293] "Volume detached for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299565 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299658 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299685 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299700 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299716 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299728 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299744 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299731 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299757 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299847 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.299868 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300798 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300839 4835 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300854 4835 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300865 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300875 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300884 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300894 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300903 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300914 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc 
kubenswrapper[4835]: I1002 10:55:44.300924 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300933 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300944 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300954 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300964 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300973 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300982 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.300992 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301002 4835 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301012 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301020 4835 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301031 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301042 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301051 4835 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301203 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301312 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.301509 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.302110 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.303484 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.304217 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.305017 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.305081 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.305594 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.306320 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.306352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.306362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.306402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.306416 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.307174 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.307752 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.308344 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.309076 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.309819 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.310329 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.311791 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.312547 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.313133 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.315551 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.315955 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.316009 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.316676 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.316765 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.316841 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.316903 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.316710 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.317039 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.317067 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.316952 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:44.816932644 +0000 UTC m=+21.376840225 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.317240 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:44.817227873 +0000 UTC m=+21.377135454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.317440 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.317611 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.318319 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.320150 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.321414 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.321573 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.321773 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.322564 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.323136 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.324746 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.328058 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.330575 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.334643 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.335059 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.335270 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.335584 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.336887 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.337596 4835 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.337716 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.339835 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.340768 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.341185 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.343174 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.344359 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.344793 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.344867 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.345028 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.345783 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.347040 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.348506 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 
10:55:44.349726 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.350696 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.351963 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.352663 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.353817 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.354483 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.355872 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.357085 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.358837 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.359757 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.360335 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.361339 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.363429 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.364570 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.372159 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.385778 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402160 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402249 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402299 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402319 4835 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402332 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402346 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402357 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402367 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402349 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402377 4835 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402580 4835 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402601 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402618 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402631 4835 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402699 4835 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402744 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402760 4835 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402783 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402799 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402818 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402835 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402851 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402865 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402879 4835 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402893 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402908 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402922 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402938 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402954 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402968 4835 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402983 4835 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.402997 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403011 4835 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403026 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403041 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403015 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403053 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403249 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403261 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403272 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403282 4835 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403292 4835 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403301 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403313 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403323 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 
crc kubenswrapper[4835]: I1002 10:55:44.403333 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403344 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403353 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403362 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403371 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403382 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403403 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403418 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403434 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403445 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403457 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403468 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403481 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403494 4835 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403506 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403519 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403531 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403540 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403549 4835 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403559 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403568 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403578 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.403588 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.409245 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.409271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.409280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.409296 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.409307 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.415981 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.428651 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.442979 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.461578 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.498735 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.505607 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.514925 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.515424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.515437 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.515517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.515553 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.549346 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.556692 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-nzxcq"] Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.557284 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.558777 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.559560 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.559961 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.574725 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.587618 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.602039 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.605699 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/18b2bc9d-d549-47f8-a503-27f19e3b0889-hosts-file\") pod \"node-resolver-nzxcq\" (UID: \"18b2bc9d-d549-47f8-a503-27f19e3b0889\") " pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.605755 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2lb5\" (UniqueName: \"kubernetes.io/projected/18b2bc9d-d549-47f8-a503-27f19e3b0889-kube-api-access-h2lb5\") pod \"node-resolver-nzxcq\" (UID: \"18b2bc9d-d549-47f8-a503-27f19e3b0889\") " pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.616070 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.618345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.618405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.618418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.618436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.618446 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.627336 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.644977 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.655980 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.706188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2lb5\" (UniqueName: \"kubernetes.io/projected/18b2bc9d-d549-47f8-a503-27f19e3b0889-kube-api-access-h2lb5\") pod \"node-resolver-nzxcq\" (UID: \"18b2bc9d-d549-47f8-a503-27f19e3b0889\") " pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.706283 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/18b2bc9d-d549-47f8-a503-27f19e3b0889-hosts-file\") pod \"node-resolver-nzxcq\" (UID: \"18b2bc9d-d549-47f8-a503-27f19e3b0889\") " pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.706408 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/18b2bc9d-d549-47f8-a503-27f19e3b0889-hosts-file\") pod \"node-resolver-nzxcq\" (UID: \"18b2bc9d-d549-47f8-a503-27f19e3b0889\") " pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.720675 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.720720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.720732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.720750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.720765 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.727945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2lb5\" (UniqueName: \"kubernetes.io/projected/18b2bc9d-d549-47f8-a503-27f19e3b0889-kube-api-access-h2lb5\") pod \"node-resolver-nzxcq\" (UID: \"18b2bc9d-d549-47f8-a503-27f19e3b0889\") " pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.806806 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.806960 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.806984 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:55:45.806954071 +0000 UTC m=+22.366861662 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.807026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.807150 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.807151 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.807262 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:45.807213419 +0000 UTC m=+22.367121020 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.807289 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:45.807278611 +0000 UTC m=+22.367186202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.823619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.823692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.823707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.823730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.823745 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.891518 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nzxcq" Oct 02 10:55:44 crc kubenswrapper[4835]: W1002 10:55:44.903727 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18b2bc9d_d549_47f8_a503_27f19e3b0889.slice/crio-80e205d996500f97832bc78cf169b494bc87205c8a9f0eb70ec46efb30c506ee WatchSource:0}: Error finding container 80e205d996500f97832bc78cf169b494bc87205c8a9f0eb70ec46efb30c506ee: Status 404 returned error can't find the container with id 80e205d996500f97832bc78cf169b494bc87205c8a9f0eb70ec46efb30c506ee Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.908201 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.908272 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908363 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908390 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908404 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908370 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908476 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908488 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908459 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-02 10:55:45.908441019 +0000 UTC m=+22.468348600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: E1002 10:55:44.908559 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:45.908527211 +0000 UTC m=+22.468434792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.926188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.926274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.926288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.926312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:44 crc kubenswrapper[4835]: I1002 10:55:44.926329 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:44Z","lastTransitionTime":"2025-10-02T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.029819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.029879 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.029894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.029926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.029943 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.133106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.133145 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.133156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.133175 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.133187 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.245375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.245416 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.245425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.245440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.245453 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.347436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.347500 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.347514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.347535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.347546 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.392184 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.392929 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.395150 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef" exitCode=255 Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.395249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.395499 4835 scope.go:117] "RemoveContainer" containerID="730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.397096 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nzxcq" event={"ID":"18b2bc9d-d549-47f8-a503-27f19e3b0889","Type":"ContainerStarted","Data":"32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.397133 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nzxcq" event={"ID":"18b2bc9d-d549-47f8-a503-27f19e3b0889","Type":"ContainerStarted","Data":"80e205d996500f97832bc78cf169b494bc87205c8a9f0eb70ec46efb30c506ee"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.403294 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c0520e2582b4a89c79852f2ca1a1044031cff1ebd0e424ed032057b713cc45e5"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.404688 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.404896 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8eba51140f75fa91819a7628489b558d28aec094a1885b50f103b3558d594ff4"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.406057 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.406111 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.406122 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8c965347a77823b56d38098e25b0b70c78351dc950d6ff4240ff282ceb731917"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.413110 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.425972 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.441601 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.450965 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.451014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.451027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.451048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.451061 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.454687 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.469766 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.482026 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.491695 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.492315 4835 scope.go:117] "RemoveContainer" containerID="aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.492513 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.499074 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.513493 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.524713 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.536901 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.548872 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.553252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.553317 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.553335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.553361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.553376 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.563511 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.580085 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:38Z\\\",\\\"message\\\":\\\"W1002 10:55:27.451493 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 10:55:27.451810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759402527 cert, and key in /tmp/serving-cert-560570698/serving-signer.crt, /tmp/serving-cert-560570698/serving-signer.key\\\\nI1002 10:55:27.692654 1 observer_polling.go:159] Starting file observer\\\\nW1002 10:55:27.695943 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 10:55:27.696146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:27.698512 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-560570698/tls.crt::/tmp/serving-cert-560570698/tls.key\\\\\\\"\\\\nF1002 10:55:38.043312 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"star
tTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.607562 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.624631 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.655479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.655521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.655531 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.655549 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.655559 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.758257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.758298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.758310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.758326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.758336 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.779355 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2tw4v"] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.779708 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.780798 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-bjtqm"] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.781273 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.781727 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.781865 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.783497 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79zgl"] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.784179 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.784519 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.784666 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.784799 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.785552 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.785771 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5ckb9"] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.785995 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.786209 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.787330 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.787595 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.787828 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.788075 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.788961 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.791536 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.791651 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.791815 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.791881 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.791885 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.792160 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.793739 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.802185 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.820626 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.820735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-slash\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.820764 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-k8s-cni-cncf-io\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.820788 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cni-binary-copy\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.820817 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-systemd\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.820860 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:55:47.82082072 +0000 UTC m=+24.380728301 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.820925 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-env-overrides\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821047 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-netns\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-daemon-config\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821185 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-mcd-auth-proxy-config\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821217 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cea2edfd-8b9c-44be-be9a-d2feb410da71-cni-binary-copy\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-etc-kubernetes\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821418 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-cni-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821490 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-ovn-kubernetes\") pod \"ovnkube-node-79zgl\" (UID: 
\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821519 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-bin\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821541 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbdj\" (UniqueName: \"kubernetes.io/projected/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-kube-api-access-fsbdj\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821603 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821626 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-os-release\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821646 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-socket-dir-parent\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821666 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-os-release\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-etc-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821715 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-script-lib\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.821742 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821758 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-hostroot\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.821764 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.821809 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:47.821796068 +0000 UTC m=+24.381703769 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821837 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cnibin\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.821873 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:47.821847519 +0000 UTC m=+24.381755100 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821910 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-node-log\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821931 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-cnibin\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821955 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-kubelet\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821972 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-netns\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.821989 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-log-socket\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822007 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-cni-bin\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822055 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-conf-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822101 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btz7x\" (UniqueName: \"kubernetes.io/projected/cea2edfd-8b9c-44be-be9a-d2feb410da71-kube-api-access-btz7x\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 
10:55:45.822147 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822246 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-kubelet\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-multus-certs\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-systemd-units\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822157 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:38Z\\\",\\\"message\\\":\\\"W1002 10:55:27.451493 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 
10:55:27.451810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759402527 cert, and key in /tmp/serving-cert-560570698/serving-signer.crt, /tmp/serving-cert-560570698/serving-signer.key\\\\nI1002 10:55:27.692654 1 observer_polling.go:159] Starting file observer\\\\nW1002 10:55:27.695943 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 10:55:27.696146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:27.698512 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-560570698/tls.crt::/tmp/serving-cert-560570698/tls.key\\\\\\\"\\\\nF1002 10:55:38.043312 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 
10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822315 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-ovn\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822450 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-netd\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: 
I1002 10:55:45.822499 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-system-cni-dir\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822527 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovn-node-metrics-cert\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822591 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-proxy-tls\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822617 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpt25\" (UniqueName: \"kubernetes.io/projected/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-kube-api-access-lpt25\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822643 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-var-lib-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822663 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-config\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822692 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-rootfs\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822719 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-system-cni-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822743 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-cni-multus\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822779 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.822826 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6cwt\" (UniqueName: \"kubernetes.io/projected/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-kube-api-access-s6cwt\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.841768 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.858425 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.860918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.860972 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.860989 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.861019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.861036 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.899081 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923712 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-cni-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923757 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-ovn-kubernetes\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923780 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-bin\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923807 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbdj\" (UniqueName: \"kubernetes.io/projected/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-kube-api-access-fsbdj\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923861 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-os-release\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923884 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-socket-dir-parent\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923901 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-os-release\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923918 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-etc-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923935 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923926 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-ovn-kubernetes\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923992 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-bin\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.923954 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-script-lib\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924111 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-cni-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924159 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-hostroot\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924131 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-hostroot\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924249 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cnibin\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924274 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-node-log\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924276 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-os-release\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924293 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-cnibin\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 
10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924322 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-kubelet\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924333 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-etc-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924343 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-netns\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924337 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-os-release\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924364 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-log-socket\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924386 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-socket-dir-parent\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924438 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924475 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-cni-bin\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924481 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cnibin\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924503 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-conf-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924523 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-cnibin\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924532 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btz7x\" (UniqueName: \"kubernetes.io/projected/cea2edfd-8b9c-44be-be9a-d2feb410da71-kube-api-access-btz7x\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924545 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-log-socket\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924558 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924611 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-kubelet\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924636 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-multus-certs\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924662 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-systemd-units\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-ovn\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924708 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-netd\") pod \"ovnkube-node-79zgl\" (UID: 
\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924724 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-conf-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924756 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-multus-certs\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924772 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924793 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-systemd-units\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924799 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-script-lib\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924815 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-system-cni-dir\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924795 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-system-cni-dir\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924829 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-ovn\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924843 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-netd\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924861 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924880 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-netns\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovn-node-metrics-cert\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924908 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-proxy-tls\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924911 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-cni-bin\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924926 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpt25\" (UniqueName: \"kubernetes.io/projected/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-kube-api-access-lpt25\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.924938 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.924960 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.924973 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924983 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-kubelet\") pod \"ovnkube-node-79zgl\" (UID: 
\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.924984 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.925020 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:47.924999235 +0000 UTC m=+24.484906816 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925025 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-node-log\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925051 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-kubelet\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925260 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-var-lib-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925293 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-config\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-var-lib-openvswitch\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925456 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-rootfs\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925539 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-system-cni-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-cni-multus\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925607 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925628 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6cwt\" (UniqueName: \"kubernetes.io/projected/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-kube-api-access-s6cwt\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925636 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-system-cni-dir\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925677 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-rootfs\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-var-lib-cni-multus\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925707 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-slash\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925828 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-k8s-cni-cncf-io\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925862 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-slash\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925904 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-k8s-cni-cncf-io\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.925958 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cni-binary-copy\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926052 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926120 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-config\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-systemd\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.926193 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.926234 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.926249 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926252 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-systemd\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926214 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-env-overrides\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926294 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-netns\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926264 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: E1002 10:55:45.926336 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:47.926322273 +0000 UTC m=+24.486229854 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926363 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-daemon-config\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926376 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-host-run-netns\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926593 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-mcd-auth-proxy-config\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926620 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-env-overrides\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926639 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cea2edfd-8b9c-44be-be9a-d2feb410da71-cni-binary-copy\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926744 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-etc-kubernetes\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926769 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-cni-binary-copy\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.926832 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cea2edfd-8b9c-44be-be9a-d2feb410da71-etc-kubernetes\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.927053 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cea2edfd-8b9c-44be-be9a-d2feb410da71-multus-daemon-config\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.927252 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-mcd-auth-proxy-config\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.927367 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cea2edfd-8b9c-44be-be9a-d2feb410da71-cni-binary-copy\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.931751 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovn-node-metrics-cert\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.942187 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-proxy-tls\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.947311 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.965274 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btz7x\" (UniqueName: \"kubernetes.io/projected/cea2edfd-8b9c-44be-be9a-d2feb410da71-kube-api-access-btz7x\") pod \"multus-2tw4v\" (UID: \"cea2edfd-8b9c-44be-be9a-d2feb410da71\") " pod="openshift-multus/multus-2tw4v" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.974077 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.974157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.974169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.974209 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.974249 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:45Z","lastTransitionTime":"2025-10-02T10:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.978994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbdj\" (UniqueName: \"kubernetes.io/projected/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-kube-api-access-fsbdj\") pod \"ovnkube-node-79zgl\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.986130 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpt25\" (UniqueName: \"kubernetes.io/projected/ce0ad186-63b7-432a-a0ca-4d4cbde057a8-kube-api-access-lpt25\") pod \"machine-config-daemon-5ckb9\" (UID: \"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\") " pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.989082 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:45 crc kubenswrapper[4835]: I1002 10:55:45.992038 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6cwt\" (UniqueName: \"kubernetes.io/projected/e295ff08-63dc-4638-8fb6-6ee6b07ccaa0-kube-api-access-s6cwt\") pod \"multus-additional-cni-plugins-bjtqm\" (UID: \"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\") " pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.015302 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.038459 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.054719 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.069602 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.077574 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.077619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.077634 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.077654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.077667 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.086935 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.095568 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2tw4v" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.101596 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc 
kubenswrapper[4835]: I1002 10:55:46.102695 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:55:46 crc kubenswrapper[4835]: W1002 10:55:46.106970 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea2edfd_8b9c_44be_be9a_d2feb410da71.slice/crio-d4c1e1c12d75bd36d50a2361c62c601fcabb3e3ff8bb594a5400a30ad72eef80 WatchSource:0}: Error finding container d4c1e1c12d75bd36d50a2361c62c601fcabb3e3ff8bb594a5400a30ad72eef80: Status 404 returned error can't find the container with id d4c1e1c12d75bd36d50a2361c62c601fcabb3e3ff8bb594a5400a30ad72eef80 Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.110741 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.116678 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:46 crc kubenswrapper[4835]: W1002 10:55:46.121374 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0ad186_63b7_432a_a0ca_4d4cbde057a8.slice/crio-95a305a308dfc62d4918ef9168f340137b3a6598374af4f66fa8c79701d101b8 WatchSource:0}: Error finding container 95a305a308dfc62d4918ef9168f340137b3a6598374af4f66fa8c79701d101b8: Status 404 returned error can't find the container with id 95a305a308dfc62d4918ef9168f340137b3a6598374af4f66fa8c79701d101b8 Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.125015 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:38Z\\\",\\\"message\\\":\\\"W1002 10:55:27.451493 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 
10:55:27.451810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759402527 cert, and key in /tmp/serving-cert-560570698/serving-signer.crt, /tmp/serving-cert-560570698/serving-signer.key\\\\nI1002 10:55:27.692654 1 observer_polling.go:159] Starting file observer\\\\nW1002 10:55:27.695943 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 10:55:27.696146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:27.698512 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-560570698/tls.crt::/tmp/serving-cert-560570698/tls.key\\\\\\\"\\\\nF1002 10:55:38.043312 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 
10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: W1002 10:55:46.138369 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode295ff08_63dc_4638_8fb6_6ee6b07ccaa0.slice/crio-159d4e22fb1bc5a476b200b5589e1e1e7e1b9cad528cf98c6cb34b3e87ba244a WatchSource:0}: Error finding container 159d4e22fb1bc5a476b200b5589e1e1e7e1b9cad528cf98c6cb34b3e87ba244a: Status 404 returned error can't find the container with id 159d4e22fb1bc5a476b200b5589e1e1e7e1b9cad528cf98c6cb34b3e87ba244a Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.138930 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.153688 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.168436 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.189192 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.192862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.192905 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.192917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.192936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.192948 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.209600 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.227897 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.251049 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:46 crc kubenswrapper[4835]: E1002 10:55:46.251162 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.251247 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:46 crc kubenswrapper[4835]: E1002 10:55:46.251295 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.251347 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:46 crc kubenswrapper[4835]: E1002 10:55:46.251390 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.254770 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\
\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.257799 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.258787 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.259559 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.260915 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.261684 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.262687 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.263373 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.264738 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.265331 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 10:55:46 crc 
kubenswrapper[4835]: I1002 10:55:46.297709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.297737 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.297746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.297764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.297777 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.399745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.399797 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.399808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.399835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.399860 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.410188 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803" exitCode=0 Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.410281 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.410359 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"cd8fd04bf7f63162a8b7df6748366eb034afcb8e5577d5632c8151c20137d1d4"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.411905 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerStarted","Data":"1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.411936 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerStarted","Data":"159d4e22fb1bc5a476b200b5589e1e1e7e1b9cad528cf98c6cb34b3e87ba244a"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.414427 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.414511 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.414531 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"95a305a308dfc62d4918ef9168f340137b3a6598374af4f66fa8c79701d101b8"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.422663 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerStarted","Data":"48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.422710 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerStarted","Data":"d4c1e1c12d75bd36d50a2361c62c601fcabb3e3ff8bb594a5400a30ad72eef80"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.424736 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.426581 4835 scope.go:117] 
"RemoveContainer" containerID="aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef" Oct 02 10:55:46 crc kubenswrapper[4835]: E1002 10:55:46.426719 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.428639 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.464352 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.477894 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.491922 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.502924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.502975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.502986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.503004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.503015 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.505118 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.518824 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.534696 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://730423b061cd09f46f64bd681d4abda7405af7a028fed8d7a46d1a6d696d4d13\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:38Z\\\",\\\"message\\\":\\\"W1002 10:55:27.451493 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 
10:55:27.451810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759402527 cert, and key in /tmp/serving-cert-560570698/serving-signer.crt, /tmp/serving-cert-560570698/serving-signer.key\\\\nI1002 10:55:27.692654 1 observer_polling.go:159] Starting file observer\\\\nW1002 10:55:27.695943 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 10:55:27.696146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:27.698512 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-560570698/tls.crt::/tmp/serving-cert-560570698/tls.key\\\\\\\"\\\\nF1002 10:55:38.043312 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 
10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.551488 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.572955 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.592709 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.605194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.605272 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.605289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.605312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.605327 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.610781 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.626638 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.641371 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.653293 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.666934 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.688335 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.706081 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.707948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.707997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.708010 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.708030 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.708040 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.724570 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.738100 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.753468 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.773082 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.788692 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.807256 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.811100 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.811149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.811164 4835 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.811184 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.811196 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.828856 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:46Z 
is after 2025-08-24T17:21:41Z"
Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.913346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.913392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.913405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.913424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:55:46 crc kubenswrapper[4835]: I1002 10:55:46.913437 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:46Z","lastTransitionTime":"2025-10-02T10:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.015549 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.015613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.015625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.015649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.015664 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.123630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.123689 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.123702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.123724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.123736 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.226131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.226171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.226180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.226200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.226211 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.329132 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.329184 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.329194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.329212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.329241 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.431361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.431413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.431427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.431445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.431460 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.432657 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.432699 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.432716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.432729 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.434658 4835 generic.go:334] "Generic (PLEG): container finished" podID="e295ff08-63dc-4638-8fb6-6ee6b07ccaa0" containerID="1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a" exitCode=0 Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.434713 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerDied","Data":"1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.435911 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.455779 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.468653 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.487507 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.508449 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.528127 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.533653 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.533716 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.533733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.533753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.533785 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.541548 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.555332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.577172 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.590478 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.608610 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.626009 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.636578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.636627 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.636638 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.636656 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.636671 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.641577 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.655265 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.668890 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.683158 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.698735 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.711371 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.724915 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.739462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.739494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.739503 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.739523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.739534 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.739860 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.754414 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.768378 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.784278 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.802198 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.819141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:47Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.842784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.842813 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.842825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.842941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.842953 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:55:51.842929355 +0000 UTC m=+28.402836946 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.842969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.842985 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.843074 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.843112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.843247 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.843257 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.843289 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:51.843280005 +0000 UTC m=+28.403187666 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.843318 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:51.843300005 +0000 UTC m=+28.403207586 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.944243 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.944328 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944388 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944405 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944414 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944417 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944427 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944429 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944478 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:51.944460004 +0000 UTC m=+28.504367595 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:47 crc kubenswrapper[4835]: E1002 10:55:47.944496 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:51.944489144 +0000 UTC m=+28.504396725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.946026 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.946058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.946067 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.946082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:47 crc kubenswrapper[4835]: I1002 10:55:47.946093 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:47Z","lastTransitionTime":"2025-10-02T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.048661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.048702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.048711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.048728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.048737 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.096141 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bpzpk"] Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.096594 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.098725 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.099478 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.100095 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.100153 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.112929 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.125722 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.146672 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/148f7288-2984-4fd9-8d43-9ee90fb4adaf-host\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.146725 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f7288-2984-4fd9-8d43-9ee90fb4adaf-serviceca\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.146830 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5m4\" (UniqueName: \"kubernetes.io/projected/148f7288-2984-4fd9-8d43-9ee90fb4adaf-kube-api-access-br5m4\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.147608 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.160525 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.164198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.164252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.164264 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.164283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.164297 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.174941 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.190474 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.203449 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.213083 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.216959 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.217449 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.223855 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.232011 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.243947 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.249270 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5m4\" (UniqueName: \"kubernetes.io/projected/148f7288-2984-4fd9-8d43-9ee90fb4adaf-kube-api-access-br5m4\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.249356 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/148f7288-2984-4fd9-8d43-9ee90fb4adaf-host\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.249378 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f7288-2984-4fd9-8d43-9ee90fb4adaf-serviceca\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.249741 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/148f7288-2984-4fd9-8d43-9ee90fb4adaf-host\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 
crc kubenswrapper[4835]: I1002 10:55:48.250782 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f7288-2984-4fd9-8d43-9ee90fb4adaf-serviceca\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.251147 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.251177 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.251253 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:48 crc kubenswrapper[4835]: E1002 10:55:48.251332 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:48 crc kubenswrapper[4835]: E1002 10:55:48.251409 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:48 crc kubenswrapper[4835]: E1002 10:55:48.251525 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.260755 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.267684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.268430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.268444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.268465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.268468 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5m4\" (UniqueName: \"kubernetes.io/projected/148f7288-2984-4fd9-8d43-9ee90fb4adaf-kube-api-access-br5m4\") pod \"node-ca-bpzpk\" (UID: \"148f7288-2984-4fd9-8d43-9ee90fb4adaf\") " pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.268477 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.280932 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.307206 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.323147 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.337721 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.352650 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.364581 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.371445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.371504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.371517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.371547 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.371561 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.378384 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.393952 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.410983 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bpzpk" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.411965 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.432531 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.442037 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerStarted","Data":"c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.442978 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bpzpk" event={"ID":"148f7288-2984-4fd9-8d43-9ee90fb4adaf","Type":"ContainerStarted","Data":"e724e951bea64f5473c522d5c6155d5547dc9aff1c5bce9ca620cff417ecbcdd"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.449046 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.449118 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.450079 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: E1002 10:55:48.456731 4835 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.474432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.474474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.474482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.474497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.474507 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.483265 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.498920 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.520597 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.541555 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.580665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.580707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.580717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.580735 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.580747 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.588434 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33
609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.626371 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.660703 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.684404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.684459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.684471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.684490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.684500 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.684911 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.697614 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.712312 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.747589 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.786821 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.786865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.786874 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.786892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.786905 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.788042 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.828281 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.870846 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.889814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.889859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.889873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.889893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.889905 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.910468 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.949775 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.992156 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:48Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.993262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.993300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.993311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.993328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:48 crc kubenswrapper[4835]: I1002 10:55:48.993340 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:48Z","lastTransitionTime":"2025-10-02T10:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.029561 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.070754 4835 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.096317 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.096402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.096424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.096461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.096484 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.199497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.199572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.199593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.199621 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.199642 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.302967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.303017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.303049 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.303069 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.303080 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.406845 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.406900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.406909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.406929 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.406942 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.456709 4835 generic.go:334] "Generic (PLEG): container finished" podID="e295ff08-63dc-4638-8fb6-6ee6b07ccaa0" containerID="c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d" exitCode=0 Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.456842 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerDied","Data":"c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.458791 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bpzpk" event={"ID":"148f7288-2984-4fd9-8d43-9ee90fb4adaf","Type":"ContainerStarted","Data":"a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.481614 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPa
th\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.498512 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.512157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.512236 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.512250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.512273 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.512286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.532836 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.549210 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.569644 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.583420 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf
5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.601927 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.615635 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.615975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.616001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.616008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.616027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.616037 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.629109 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.641639 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.656736 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.671946 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.684326 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.701080 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.716495 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.718479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.718521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.718531 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.718549 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.718561 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.727584 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.762952 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.791687 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.821158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.821206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.821237 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.821263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.821281 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.835290 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33
609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.871838 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.914090 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.924288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.924342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.924360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.924387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.924408 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:49Z","lastTransitionTime":"2025-10-02T10:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:49 crc kubenswrapper[4835]: I1002 10:55:49.948952 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.022790 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.027066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.027129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.027146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.027170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.027186 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.049170 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.069018 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.111708 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.130259 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.130312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.130323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.130347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.130363 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.153335 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.188975 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.233387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.233425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.233434 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.233452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.233464 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.251723 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:50 crc kubenswrapper[4835]: E1002 10:55:50.251849 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.251724 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:50 crc kubenswrapper[4835]: E1002 10:55:50.252007 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.252038 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:50 crc kubenswrapper[4835]: E1002 10:55:50.252334 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.335738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.335782 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.335794 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.335811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.335821 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.439034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.439099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.439113 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.439138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.439153 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.465822 4835 generic.go:334] "Generic (PLEG): container finished" podID="e295ff08-63dc-4638-8fb6-6ee6b07ccaa0" containerID="8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca" exitCode=0 Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.465894 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerDied","Data":"8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.472107 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.489780 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.507895 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.518422 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.531208 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.541891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.541938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.541953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.541978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.541992 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.543969 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.556802 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.568745 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.583263 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.601147 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.619205 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.632987 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.644844 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.644910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.644927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.644953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.644972 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.670695 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.711912 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.747833 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.747886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.747898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.748347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.748378 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.756658 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33
609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.852296 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.852360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.852373 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.852403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.852417 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.955523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.955583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.955604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.955633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:50 crc kubenswrapper[4835]: I1002 10:55:50.955653 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:50Z","lastTransitionTime":"2025-10-02T10:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.058582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.059066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.059078 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.059097 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.059112 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.164117 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.164156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.164169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.164188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.164203 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.266979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.267054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.267072 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.267098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.267122 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.371424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.371495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.371511 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.371536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.371560 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.474537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.474583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.474595 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.474616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.474629 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.484546 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.485164 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.485318 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.501180 4835 generic.go:334] "Generic (PLEG): container finished" podID="e295ff08-63dc-4638-8fb6-6ee6b07ccaa0" containerID="59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd" exitCode=0 Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.501288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerDied","Data":"59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.512547 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.528188 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.534068 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.542819 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.559124 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.572397 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.577714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.577766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.577780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.577801 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.577815 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.584669 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.598633 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.613179 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.627043 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.640520 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.655633 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.669969 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.680619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.680669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.680682 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.680702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.680716 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.685975 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.706190 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.726744 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd139
0b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.740871 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.760460 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9
a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.773425 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.783175 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.783250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.783263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.783288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.783302 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.798942 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.811809 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.830878 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.847934 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.869735 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.881948 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.886732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.886840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.886853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.886872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.886885 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.893861 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.894012 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.894044 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.894069 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:55:59.894040821 +0000 UTC m=+36.453948402 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.894167 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.894233 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.894240 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:59.894206686 +0000 UTC m=+36.454114267 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.894298 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:59.894284818 +0000 UTC m=+36.454192619 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.906428 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.921473 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.937256 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.952062 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.985050 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.989892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.989935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.989946 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.989963 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.989974 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:51Z","lastTransitionTime":"2025-10-02T10:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.995558 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:51 crc kubenswrapper[4835]: I1002 10:55:51.995637 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.995841 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.995874 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.995893 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.995973 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:59.995949851 +0000 UTC m=+36.555857442 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.996084 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.996134 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.996153 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:51 crc kubenswrapper[4835]: E1002 10:55:51.996276 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:55:59.996243449 +0000 UTC m=+36.556151040 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.092673 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.092708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.092718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.092738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.092752 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.195672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.195724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.195735 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.195754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.195764 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.251852 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.251972 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:52 crc kubenswrapper[4835]: E1002 10:55:52.252080 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.252261 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:52 crc kubenswrapper[4835]: E1002 10:55:52.252429 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:52 crc kubenswrapper[4835]: E1002 10:55:52.252615 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.298892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.298928 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.298941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.298960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.298971 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.401350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.401406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.401417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.401450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.401463 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.504066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.504091 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.504099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.504129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.504141 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.509144 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.509956 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerStarted","Data":"fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.607040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.607089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.607099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.607121 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.607135 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.711098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.711135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.711144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.711158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.711168 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.814172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.814236 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.814248 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.814265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.814276 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.918358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.918406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.918417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.918435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:52 crc kubenswrapper[4835]: I1002 10:55:52.918448 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:52Z","lastTransitionTime":"2025-10-02T10:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.022340 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.022396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.022411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.022434 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.022482 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.129745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.129857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.129872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.129896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.129913 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.234506 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.234564 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.234575 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.234598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.234639 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.338489 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.338769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.338785 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.338804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.338819 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.441858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.441923 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.441936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.441979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.441995 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.512437 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.528097 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.540644 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.544980 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.545056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.545074 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.545099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.545137 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.560791 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.581186 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.598620 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.619617 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.636955 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.648672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.648727 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.648739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.648760 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.648772 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.652848 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.671039 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.696193 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.721827 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.744661 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.751491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.751539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.751553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.751577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.751590 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.765900 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.795158 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd139
0b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.855886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.855943 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.855955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.855982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.855997 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.959133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.959193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.959205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.959296 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:53 crc kubenswrapper[4835]: I1002 10:55:53.959333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:53Z","lastTransitionTime":"2025-10-02T10:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.062990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.063063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.063081 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.063110 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.063130 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.165926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.166044 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.166064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.166102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.166123 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.251622 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.251656 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.251808 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.251882 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.252060 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.252269 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.271092 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.271128 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.271140 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.271157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.271170 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.276080 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.297497 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.321093 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.348597 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.371031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.371080 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.371097 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.371124 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.371144 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.370608 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.387568 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.387735 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.388447 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.388602 4835 scope.go:117] "RemoveContainer" containerID="aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.388861 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.393037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.393091 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.393111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.393210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.393256 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.405149 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.407458 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0
e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.411771 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.411817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.411834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.411860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.411879 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.420205 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.425303 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.429169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.429302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.429379 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.429457 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.429524 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.440211 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc 
kubenswrapper[4835]: E1002 10:55:54.443241 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.448041 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.448188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 
crc kubenswrapper[4835]: I1002 10:55:54.448337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.448415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.448486 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.454141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.465629 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: E1002 10:55:54.465775 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.468063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.468202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.468347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.473013 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.473037 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.474133 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.490721 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.503623 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.522332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.524096 4835 generic.go:334] "Generic (PLEG): container finished" podID="e295ff08-63dc-4638-8fb6-6ee6b07ccaa0" containerID="fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1" exitCode=0 Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.524184 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerDied","Data":"fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.541004 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.560531 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.575493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.575524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.575558 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.575593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.575613 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.584137 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.596008 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.608860 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.644891 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.660611 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.676757 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.678870 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.678902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.678917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.678938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.678950 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.692180 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.707916 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.722983 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.734762 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.752758 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.768454 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.782240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.782641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.782653 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.782675 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.782687 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.885598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.885649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.885661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.885687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.885700 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.989198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.989250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.989262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.989286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:54 crc kubenswrapper[4835]: I1002 10:55:54.989297 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:54Z","lastTransitionTime":"2025-10-02T10:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.092061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.092122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.092137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.092160 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.092179 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.195772 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.195832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.195844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.195869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.195883 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.299038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.299085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.299096 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.299116 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.299130 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.401763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.401832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.401844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.401887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.401903 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.504193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.504266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.504279 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.504298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.504313 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.533568 4835 generic.go:334] "Generic (PLEG): container finished" podID="e295ff08-63dc-4638-8fb6-6ee6b07ccaa0" containerID="f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5" exitCode=0 Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.533687 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerDied","Data":"f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.549871 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.562953 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.580101 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.605210 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe
6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.609933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.609991 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.610005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.610031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.610047 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.637825 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd139
0b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.654017 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.669150 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.681054 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.724685 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.727021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.727072 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.727083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.727105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.727117 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.739095 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.752191 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.770425 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.786437 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.807405 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.829954 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.829999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.830010 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.830028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.830040 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.932672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.932725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.932738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.932761 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:55 crc kubenswrapper[4835]: I1002 10:55:55.932775 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:55Z","lastTransitionTime":"2025-10-02T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.035260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.035298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.035307 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.035326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.035338 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.138578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.138623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.138633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.138652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.138662 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.242077 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.242125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.242136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.242156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.242166 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.251698 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.251744 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.251783 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:56 crc kubenswrapper[4835]: E1002 10:55:56.251819 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:56 crc kubenswrapper[4835]: E1002 10:55:56.251868 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:56 crc kubenswrapper[4835]: E1002 10:55:56.251982 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.345969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.346019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.346033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.346058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.346078 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.449298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.449339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.449350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.449367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.449377 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.541096 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" event={"ID":"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0","Type":"ContainerStarted","Data":"30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.546183 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/0.log" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.552330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.552393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.552403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.552427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.552440 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.556651 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c" exitCode=1 Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.556720 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.557926 4835 scope.go:117] "RemoveContainer" containerID="cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.561260 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.583492 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.599415 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.623408 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.646458 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/r
un/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.655694 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.655725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 
10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.655737 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.655756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.655771 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.661108 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
2T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.679474 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.694629 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.708532 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.727258 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.752986 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.766631 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.766776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.766868 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.766888 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.766911 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.769248 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.783392 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.796243 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.812301 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.826400 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.841443 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.863764 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.869640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.869699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.869720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.869832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.869854 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.891296 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd139
0b6a9dee50fff8e046e7c65c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:55:56.432692 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433081 6044 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:55:56.433712 6044 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433857 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.434608 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:55:56.434653 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:55:56.434695 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 10:55:56.434696 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:55:56.434712 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 10:55:56.434769 6044 factory.go:656] Stopping watch factory\\\\nI1002 10:55:56.434783 6044 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.909464 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.925691 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05
b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.940509 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.957660 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.973068 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.973124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.973134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.973159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.973177 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:56Z","lastTransitionTime":"2025-10-02T10:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.973498 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:56 crc kubenswrapper[4835]: I1002 10:55:56.992409 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.011816 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.027960 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.045832 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.076662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.076730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.076743 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.076766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.076813 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.180507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.180970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.181195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.181468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.181682 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.284852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.284944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.284970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.285001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.285025 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.387378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.387873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.388042 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.388280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.388509 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.491263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.491326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.491338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.491380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.491394 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.561541 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/0.log" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.564091 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.564265 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.584400 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.594674 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.594728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.594737 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.594759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.594769 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.597331 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.610751 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.628037 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.650668 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:55:56.432692 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433081 6044 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:55:56.433712 6044 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433857 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.434608 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:55:56.434653 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:55:56.434695 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 10:55:56.434696 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:55:56.434712 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 10:55:56.434769 6044 factory.go:656] Stopping watch factory\\\\nI1002 10:55:56.434783 6044 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.665463 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.681983 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.697861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.697808 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.697904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.697916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.697934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.697967 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.709616 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.725793 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.751774 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.767189 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.781399 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.791542 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.801023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.801065 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.801075 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.801093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.801105 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.904214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.904288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.904302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.904332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:57 crc kubenswrapper[4835]: I1002 10:55:57.904345 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:57Z","lastTransitionTime":"2025-10-02T10:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.007048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.007099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.007108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.007140 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.007151 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.023985 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9"] Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.024927 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.030001 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.036953 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.044840 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.057539 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.070192 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.070272 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.070350 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.070372 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shq7z\" (UniqueName: \"kubernetes.io/projected/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-kube-api-access-shq7z\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.071871 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.088539 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.109866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.109916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.109927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.109947 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.109958 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.112793 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8
a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:55:56.432692 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433081 6044 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:55:56.433712 6044 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433857 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.434608 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:55:56.434653 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:55:56.434695 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 10:55:56.434696 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:55:56.434712 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 10:55:56.434769 6044 factory.go:656] Stopping watch factory\\\\nI1002 10:55:56.434783 6044 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.129354 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.144041 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.160062 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.171799 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.171848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shq7z\" (UniqueName: \"kubernetes.io/projected/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-kube-api-access-shq7z\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.171877 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.171910 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.172900 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.172971 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.174477 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.180192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.188034 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.191881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shq7z\" (UniqueName: \"kubernetes.io/projected/f98828b9-0b27-4632-bfd1-d494cb8dfcfb-kube-api-access-shq7z\") pod \"ovnkube-control-plane-749d76644c-gm7l9\" (UID: \"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.203755 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea
177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.214835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.214890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.214906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.214930 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.214948 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.218004 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.251720 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:55:58 crc kubenswrapper[4835]: E1002 10:55:58.251871 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.252333 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:58 crc kubenswrapper[4835]: E1002 10:55:58.252404 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.252438 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:55:58 crc kubenswrapper[4835]: E1002 10:55:58.252492 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.263544 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.286063 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.306064 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.318185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.318247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.318261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.318280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.318292 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.349853 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.421753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.421794 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.421804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.421822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.421836 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.525615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.525675 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.525689 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.525715 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.525731 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.590209 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/1.log" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.591759 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/0.log" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.595667 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd" exitCode=1 Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.595929 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.596025 4835 scope.go:117] "RemoveContainer" containerID="cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.596895 4835 scope.go:117] "RemoveContainer" containerID="6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd" Oct 02 10:55:58 crc kubenswrapper[4835]: E1002 10:55:58.597068 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.597059 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" event={"ID":"f98828b9-0b27-4632-bfd1-d494cb8dfcfb","Type":"ContainerStarted","Data":"131525c58b14acfa762bda360d37c90c1afbdeae3c0ee4b9ddc693f15d16bab0"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.615719 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.629244 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.629469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.629502 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.629512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.629531 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.629555 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.650997 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.668250 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.691870 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:55:56.432692 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433081 6044 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:55:56.433712 6044 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433857 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.434608 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:55:56.434653 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:55:56.434695 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 10:55:56.434696 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:55:56.434712 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 10:55:56.434769 6044 factory.go:656] Stopping watch factory\\\\nI1002 10:55:56.434783 6044 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.708240 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.723995 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.733582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.733630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.733645 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.733669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.733770 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.741348 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.756168 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.768406 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.781263 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.797647 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.813553 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.828130 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.837702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.837914 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.837999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.838095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.838178 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.843133 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:58Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.942079 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.942192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.942218 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.942305 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:58 crc kubenswrapper[4835]: I1002 10:55:58.942331 4835 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:58Z","lastTransitionTime":"2025-10-02T10:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.045916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.045958 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.045968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.045987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.045998 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.150016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.150058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.150069 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.150086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.150099 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.254048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.254119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.254137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.254167 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.254189 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.361605 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.361649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.361659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.361680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.361692 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.463969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.464058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.464087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.464107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.464118 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.502612 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5j5j6"] Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.503529 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.503666 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.527655 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\
"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.547157 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.567380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.567422 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.567448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.567466 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.567475 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.569179 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.583097 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.596485 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.597706 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2s4s\" (UniqueName: \"kubernetes.io/projected/7fddaac1-5041-411a-8aed-e7337c06713f-kube-api-access-q2s4s\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.597759 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.610016 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/1.log" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.616692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" event={"ID":"f98828b9-0b27-4632-bfd1-d494cb8dfcfb","Type":"ContainerStarted","Data":"dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.616891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" event={"ID":"f98828b9-0b27-4632-bfd1-d494cb8dfcfb","Type":"ContainerStarted","Data":"b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.617134 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.630333 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.643954 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.654815 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.670162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.670197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.670207 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.670242 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.670252 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.672146 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.687276 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.699067 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2s4s\" (UniqueName: \"kubernetes.io/projected/7fddaac1-5041-411a-8aed-e7337c06713f-kube-api-access-q2s4s\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.699161 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.699627 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.699770 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:56:00.19973523 +0000 UTC m=+36.759642851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.701697 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.718773 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.729348 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2s4s\" (UniqueName: \"kubernetes.io/projected/7fddaac1-5041-411a-8aed-e7337c06713f-kube-api-access-q2s4s\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " 
pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.741335 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:55:56.432692 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433081 6044 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:55:56.433712 6044 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433857 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.434608 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:55:56.434653 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:55:56.434695 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 10:55:56.434696 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:55:56.434712 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 10:55:56.434769 6044 factory.go:656] Stopping watch factory\\\\nI1002 10:55:56.434783 6044 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 
model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.753214 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.769339 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.773747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.773799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.773813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.773835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.773849 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.789938 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.812171 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.828597 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.843803 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.857762 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.876758 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.876815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.876830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.876862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.876879 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.876871 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.895213 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.901568 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.901739 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:56:15.901704757 +0000 UTC m=+52.461612368 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.901891 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.901942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.902051 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.902096 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.902127 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:15.902109019 +0000 UTC m=+52.462016640 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:55:59 crc kubenswrapper[4835]: E1002 10:55:59.902156 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:15.90214104 +0000 UTC m=+52.462048661 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.908590 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.918278 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.934321 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.946108 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 
10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.959299 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.974476 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.979598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.979639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:55:59 crc 
kubenswrapper[4835]: I1002 10:55:59.979652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.979679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.979696 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:55:59Z","lastTransitionTime":"2025-10-02T10:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:55:59 crc kubenswrapper[4835]: I1002 10:55:59.996243 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8
a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2af07c480b24a78921d1f603c6c34c7fadd1390b6a9dee50fff8e046e7c65c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:55:56.432692 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433081 6044 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:55:56.433712 6044 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.433857 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:55:56.434608 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:55:56.434653 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:55:56.434695 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 10:55:56.434696 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:55:56.434712 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 10:55:56.434769 6044 factory.go:656] Stopping watch factory\\\\nI1002 10:55:56.434783 6044 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:55:59Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.002981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.003059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003237 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003269 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003260 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003316 4835 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003336 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003285 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003425 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:16.003395931 +0000 UTC m=+52.563303512 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.003508 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:16.003484123 +0000 UTC m=+52.563391824 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.011125 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.024325 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.083037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.083084 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.083095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.083114 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.083125 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.186849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.186891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.186904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.186923 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.186938 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.205975 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.206179 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.206286 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:56:01.206258623 +0000 UTC m=+37.766166224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.251759 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.251770 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.252018 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.252100 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.252210 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.252368 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.291817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.291873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.291883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.291907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.291923 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.395639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.395698 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.395715 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.395740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.395762 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.499798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.499866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.499890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.499924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.499952 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.603263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.603311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.603322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.603344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.603355 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.706202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.706274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.706286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.706321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.706334 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.713899 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.714831 4835 scope.go:117] "RemoveContainer" containerID="6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd" Oct 02 10:56:00 crc kubenswrapper[4835]: E1002 10:56:00.715022 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.730603 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.746112 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.765350 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.780749 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.795299 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.809818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.809891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.809912 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.809941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.809991 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.816675 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.834140 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.847278 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.867910 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.887437 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.908028 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.913531 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.913610 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.913667 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.913699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.913721 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:00Z","lastTransitionTime":"2025-10-02T10:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.928721 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.954463 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.973485 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:00 crc kubenswrapper[4835]: I1002 10:56:00.997864 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.017409 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.017490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.017509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.017539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.017558 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.024008 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:
55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.120797 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.120848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.120860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.120880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.120891 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.218490 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:01 crc kubenswrapper[4835]: E1002 10:56:01.218936 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:01 crc kubenswrapper[4835]: E1002 10:56:01.219171 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:56:03.219130503 +0000 UTC m=+39.779038204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.224847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.224906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.224924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.224966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.224986 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.251570 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:01 crc kubenswrapper[4835]: E1002 10:56:01.251844 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.330464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.330515 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.330525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.330548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.330561 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.433990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.434059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.434072 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.434094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.434113 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.537831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.537897 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.537913 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.537936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.537951 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.641102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.641612 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.641754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.641853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.641972 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.746170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.746216 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.746239 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.746256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.746267 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.849096 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.849138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.849149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.849174 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.849189 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.952946 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.953049 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.953079 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.953123 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:01 crc kubenswrapper[4835]: I1002 10:56:01.953169 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:01Z","lastTransitionTime":"2025-10-02T10:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.055956 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.056014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.056026 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.056047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.056061 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.159948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.160037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.160056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.160084 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.160103 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.251778 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.251877 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.252038 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:02 crc kubenswrapper[4835]: E1002 10:56:02.252046 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:02 crc kubenswrapper[4835]: E1002 10:56:02.252340 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:02 crc kubenswrapper[4835]: E1002 10:56:02.252568 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.262070 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.262129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.262149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.262173 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.262187 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.365765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.365840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.365864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.365900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.365922 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.469593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.469657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.469667 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.469692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.469704 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.572681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.572753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.572770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.572795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.572813 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.675819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.675890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.675964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.675995 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.676013 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.779610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.779674 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.779690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.779714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.779729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.884759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.885089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.885159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.885242 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.885316 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.989075 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.989139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.989156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.989183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:02 crc kubenswrapper[4835]: I1002 10:56:02.989201 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:02Z","lastTransitionTime":"2025-10-02T10:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.092762 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.092849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.092877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.092907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.092928 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.195590 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.195669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.195693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.195724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.195749 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.241851 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:03 crc kubenswrapper[4835]: E1002 10:56:03.242097 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:03 crc kubenswrapper[4835]: E1002 10:56:03.242168 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:56:07.242145124 +0000 UTC m=+43.802052735 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.250795 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:03 crc kubenswrapper[4835]: E1002 10:56:03.251005 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.298818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.298871 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.298884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.298905 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.298919 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.401936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.402010 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.402033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.402065 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.402091 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.505803 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.505876 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.505897 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.505921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.505940 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.609533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.609590 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.609603 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.609629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.609644 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.716562 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.716628 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.716659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.716690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.716712 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.820446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.820517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.820535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.820562 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.820582 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.923866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.923945 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.923963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.923993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:03 crc kubenswrapper[4835]: I1002 10:56:03.924014 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:03Z","lastTransitionTime":"2025-10-02T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.028134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.028195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.028212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.028275 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.028299 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.131726 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.131787 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.131810 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.131840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.131859 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.235615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.235661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.235672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.235692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.235707 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.251217 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.251360 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.251488 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.251289 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.251644 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.251876 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.274846 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10
:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.298485 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.316938 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.335023 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.338479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.338546 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.338567 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.338597 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.338633 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.352906 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.373667 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.391836 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.406243 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.422605 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.438368 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.441685 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.441850 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.441874 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.441907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.441926 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.456548 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.478209 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.504365 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.508615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.508662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.508681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.508707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.508729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.527731 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.532731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.532829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.532851 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.532884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.532915 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.541326 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8
a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.555174 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.555640 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.560647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.560714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.560733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.560762 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.560782 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.571123 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.580583 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.585596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.585630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.585640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.585673 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.585684 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.603331 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.609815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.609859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.609871 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.609934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.609949 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.629902 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:04 crc kubenswrapper[4835]: E1002 10:56:04.630072 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.631966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.632002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.632015 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.632038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.632052 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.734705 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.734753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.734764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.734788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.734801 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.837442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.837490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.837499 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.837517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.837528 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.941171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.941257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.941272 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.941294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:04 crc kubenswrapper[4835]: I1002 10:56:04.941311 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:04Z","lastTransitionTime":"2025-10-02T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.044567 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.044626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.044639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.044659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.044674 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.147713 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.147776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.147785 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.147820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.147833 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.250495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.250560 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.250586 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.250633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.250655 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.250869 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:05 crc kubenswrapper[4835]: E1002 10:56:05.251071 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.353796 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.353859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.353880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.353902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.353918 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.457659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.457753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.457769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.457789 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.458286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.561837 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.561880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.561892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.561910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.561920 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.664577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.664631 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.664647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.664672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.664690 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.767346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.767388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.767401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.767420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.767434 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.870975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.871309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.871408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.871860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.871971 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.976200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.976614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.976736 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.976872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:05 crc kubenswrapper[4835]: I1002 10:56:05.976978 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:05Z","lastTransitionTime":"2025-10-02T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.080187 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.080234 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.080245 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.080263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.080275 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.183364 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.183395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.183404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.183418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.183428 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.250938 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:06 crc kubenswrapper[4835]: E1002 10:56:06.251033 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.251293 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:06 crc kubenswrapper[4835]: E1002 10:56:06.251366 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.251378 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:06 crc kubenswrapper[4835]: E1002 10:56:06.251440 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.286883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.287021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.287035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.287506 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.287526 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.390166 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.390232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.390245 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.390266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.390281 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.492839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.492904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.492922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.492950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.492969 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.596502 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.596554 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.596568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.596591 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.596607 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.699592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.699658 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.699678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.699706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.699724 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.802949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.803022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.803037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.803062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.803076 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.906102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.906513 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.906660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.906857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:06 crc kubenswrapper[4835]: I1002 10:56:06.907010 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:06Z","lastTransitionTime":"2025-10-02T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.009820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.009861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.009869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.009886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.009898 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.113323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.113377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.113390 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.113425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.113455 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.217250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.217301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.217314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.217332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.217344 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.251159 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:07 crc kubenswrapper[4835]: E1002 10:56:07.251483 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.290993 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:07 crc kubenswrapper[4835]: E1002 10:56:07.291416 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:07 crc kubenswrapper[4835]: E1002 10:56:07.291587 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:56:15.291546823 +0000 UTC m=+51.851454434 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.320374 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.320441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.320464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.320495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.320519 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.423498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.423569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.423595 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.423624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.423644 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.526677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.526718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.526732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.526750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.526761 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.630103 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.630166 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.630188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.630256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.630282 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.733875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.733928 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.733938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.733959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.733971 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.836488 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.836538 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.836551 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.836570 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.836581 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.939563 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.939617 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.939627 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.939650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:07 crc kubenswrapper[4835]: I1002 10:56:07.939662 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:07Z","lastTransitionTime":"2025-10-02T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.043083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.043132 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.043148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.043169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.043212 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.146418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.146474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.146482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.146500 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.146510 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.248776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.248813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.248822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.248840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.248856 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.252087 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.252142 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.252183 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:08 crc kubenswrapper[4835]: E1002 10:56:08.252296 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:08 crc kubenswrapper[4835]: E1002 10:56:08.252501 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:08 crc kubenswrapper[4835]: E1002 10:56:08.252569 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.352089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.352130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.352139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.352157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.352170 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.454929 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.454975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.454983 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.455006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.455019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.558406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.558462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.558483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.558504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.558516 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.661760 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.661827 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.661838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.661875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.661889 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.765640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.765693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.765703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.765726 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.765740 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.869361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.869433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.869456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.869491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.869516 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.972555 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.972619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.972660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.972693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:08 crc kubenswrapper[4835]: I1002 10:56:08.972718 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:08Z","lastTransitionTime":"2025-10-02T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.077911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.078008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.078026 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.078057 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.078075 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.182341 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.182412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.182430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.182453 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.182474 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.251806 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:09 crc kubenswrapper[4835]: E1002 10:56:09.252587 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.252949 4835 scope.go:117] "RemoveContainer" containerID="aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.286416 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.286478 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.286493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.286521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.286538 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.391461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.391498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.391512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.391533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.391561 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.495745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.496353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.496378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.496411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.496435 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.600381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.600475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.600496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.600539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.600564 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.655962 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.658848 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.659577 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.683803 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.699679 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.705525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.705584 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.705602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.705630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.705650 
4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.717783 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.738380 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\"
,\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.758728 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.783624 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8
a11832dfe191c2c714bdfbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.797841 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.808512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.808548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.808559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.808581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.808593 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.811817 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.836754 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.853855 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9
a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.875210 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.891931 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.909082 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.910927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.910979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.910994 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.911015 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.911053 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:09Z","lastTransitionTime":"2025-10-02T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.922688 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.940332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:09 crc kubenswrapper[4835]: I1002 10:56:09.955208 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:09Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.014203 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.014301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.014315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.014342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.014359 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.118632 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.118697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.118716 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.118739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.118755 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.221691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.221757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.221774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.221798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.221814 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.251276 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.251393 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.251462 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:10 crc kubenswrapper[4835]: E1002 10:56:10.251600 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:10 crc kubenswrapper[4835]: E1002 10:56:10.251702 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:10 crc kubenswrapper[4835]: E1002 10:56:10.251931 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.324345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.324401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.324411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.324431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.324443 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.427780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.427832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.427841 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.427858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.427870 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.531094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.531479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.531571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.531690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.531781 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.634387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.634439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.634450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.634472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.634487 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.737121 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.737184 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.737197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.737244 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.737259 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.840519 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.840620 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.840654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.840687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.840709 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.943653 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.943716 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.943734 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.943761 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:10 crc kubenswrapper[4835]: I1002 10:56:10.943778 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:10Z","lastTransitionTime":"2025-10-02T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.046294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.046345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.046358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.046380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.046394 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.149039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.149146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.149162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.149186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.149198 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.250808 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:11 crc kubenswrapper[4835]: E1002 10:56:11.251047 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.252075 4835 scope.go:117] "RemoveContainer" containerID="6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.252252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.252331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.252346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.252390 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.252408 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.361299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.361347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.361358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.361377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.361388 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.464625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.464687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.464700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.464721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.464734 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.568251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.568288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.568298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.568313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.568323 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
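Every NodeNotReady record in the run above carries the same cause: kubelet finds no CNI configuration file in /etc/kubernetes/cni/net.d/, so it keeps the node's Ready condition False until the network plugin writes its config. Below is a minimal stdlib Python sketch for checking that directory directly; the second search path and the file-extension list are assumptions for illustration, not taken from this log. Once the network plugin (here multus on top of ovn-kubernetes, whose ContainerStarted and readiness events appear later in this log) finishes starting, a config file typically shows up in this directory and the condition clears.

```python
# Minimal sketch: check whether any CNI configuration files exist yet.
# /etc/kubernetes/cni/net.d/ is the directory named in the kubelet errors above;
# /etc/cni/net.d/ is an additional, commonly used location (assumption).
import glob
import os

CNI_CONF_DIRS = ["/etc/kubernetes/cni/net.d", "/etc/cni/net.d"]
CNI_CONF_EXTS = ("*.conf", "*.conflist", "*.json")

def find_cni_configs(dirs=CNI_CONF_DIRS):
    found = []
    for d in dirs:
        if not os.path.isdir(d):
            print(f"{d}: directory does not exist")
            continue
        for pattern in CNI_CONF_EXTS:
            found.extend(sorted(glob.glob(os.path.join(d, pattern))))
    return found

if __name__ == "__main__":
    configs = find_cni_configs()
    if configs:
        for path in configs:
            print("found CNI config:", path)
    else:
        # Matches the condition kubelet reports above: NetworkPluginNotReady
        print("no CNI configuration files found; the network plugin has not written its config yet")
```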
Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.670118 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/1.log" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.673622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.673677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.673688 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.673708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.673721 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.674422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.675259 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.695006 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.723771 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.750606 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.776641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.776708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.776728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.776757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.776777 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.780035 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.793380 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.806266 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.818750 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 
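The "Failed to update status for pod" errors in this stretch all fail for the same reason: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z, well before the log's current time of 2025-10-02T10:56:11Z. The status patches themselves are well formed; they are rejected at the TLS layer before the webhook ever evaluates them. Below is a minimal stdlib Python sketch that probes the endpoint and reports the verification failure, roughly what kubelet sees. Verifying against the system trust store (or an optional CA bundle file) is an assumption made for the sketch; kubelet itself verifies against the cluster CA.

```python
# Minimal sketch: probe the webhook endpoint named in the errors above and report
# why TLS verification fails. The address (127.0.0.1:9743) comes from the log;
# the trust store used for verification is an assumption of this sketch.
import socket
import ssl

WEBHOOK_ADDR = ("127.0.0.1", 9743)

def probe(addr, cafile=None):
    ctx = ssl.create_default_context(cafile=cafile)
    try:
        with socket.create_connection(addr, timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=addr[0]) as tls:
                cert = tls.getpeercert()
                print("handshake OK, certificate valid until:", cert.get("notAfter"))
    except ssl.SSLCertVerificationError as err:
        # Expected here: "certificate has expired", matching the x509 errors in the log.
        print("certificate verification failed:", err.verify_message)
    except OSError as err:
        print("connection failed:", err)

if __name__ == "__main__":
    probe(WEBHOOK_ADDR)
```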
10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.833031 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.852244 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/o
s-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.874530 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acd
e32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.879408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.879498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.879515 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.879550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.879571 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.889397 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.904351 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.924065 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.939698 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.956190 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9
a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.974285 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:11Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.982491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.982581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.982609 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.982670 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:11 crc kubenswrapper[4835]: I1002 10:56:11.982691 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:11Z","lastTransitionTime":"2025-10-02T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.086340 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.086393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.086406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.086425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.086438 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.189492 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.189561 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.189578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.189883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.189920 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.251343 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:12 crc kubenswrapper[4835]: E1002 10:56:12.251570 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.251659 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.251670 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:12 crc kubenswrapper[4835]: E1002 10:56:12.252029 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:12 crc kubenswrapper[4835]: E1002 10:56:12.252309 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.292964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.293017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.293027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.293050 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.293066 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.396519 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.396580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.396598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.396626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.396649 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.499509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.499566 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.499583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.499609 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.499631 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.603439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.603497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.603512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.603536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.603554 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.680788 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/2.log" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.681578 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/1.log" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.685496 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693" exitCode=1 Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.685547 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.685590 4835 scope.go:117] "RemoveContainer" containerID="6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.687004 4835 scope.go:117] "RemoveContainer" containerID="5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693" Oct 02 10:56:12 crc kubenswrapper[4835]: E1002 10:56:12.687326 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.707099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.707138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.707149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.707169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.707181 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.708136 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.726085 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.747052 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.766810 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9
a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.785865 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.802088 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.809816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.809849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.809860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc 
kubenswrapper[4835]: I1002 10:56:12.809881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.809894 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.820687 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has 
all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.832332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.843659 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.858168 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.870616 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.882136 4835 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.895073 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.912716 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.912765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.912780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.912803 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.912818 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:12Z","lastTransitionTime":"2025-10-02T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.916157 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.946855 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bf4ef7a42993efac1f4e6c9fb1b2456089b02e8a11832dfe191c2c714bdfbdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"message\\\":\\\"3] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:55:57.716122 6273 
model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:12 crc kubenswrapper[4835]: I1002 10:56:12.966446 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:12Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.015941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.016004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.016025 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.016097 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.016107 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.119271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.119347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.119385 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.119416 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.119437 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.222639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.222704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.222721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.222746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.222765 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.251036 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:13 crc kubenswrapper[4835]: E1002 10:56:13.251333 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.325436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.325545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.325562 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.325588 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.325604 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.428071 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.428141 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.428158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.428179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.428197 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.531106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.531163 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.531173 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.531194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.531205 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.633898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.633947 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.633957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.633976 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.633987 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.691048 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/2.log" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.695821 4835 scope.go:117] "RemoveContainer" containerID="5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693" Oct 02 10:56:13 crc kubenswrapper[4835]: E1002 10:56:13.696049 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.715725 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.735105 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.736966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.737080 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.738002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.738047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.738087 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.753627 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.769631 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.789569 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.807359 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.823995 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.841117 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.841321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.841369 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.841379 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.841396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.841424 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.863992 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.891633 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.907293 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.924158 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.944509 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.944583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.944605 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.944636 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.944660 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:13Z","lastTransitionTime":"2025-10-02T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.946345 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.969190 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:13 crc kubenswrapper[4835]: I1002 10:56:13.988317 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:13Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.003271 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.047961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.048047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.048091 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.048126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.048149 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.151695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.151799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.151824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.151866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.151892 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.251591 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.251687 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.251790 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.251906 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.252121 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.252792 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.256965 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.257034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.257047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.257078 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.257097 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.277668 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.299193 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.318262 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.334008 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.349619 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.360685 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.360731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.360742 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.360759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.360771 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.366708 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.388077 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.402488 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.422495 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"
mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.443326 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.460310 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.464691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.464893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.465087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.465314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.465512 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.482902 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.503434 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.523987 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.538865 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.555831 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.567841 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.567905 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.567921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.567955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.567975 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.660328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.660365 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.660374 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.660392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.660404 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.675063 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.680137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.680192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.680252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.680285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.680302 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.699848 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.703690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.703721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.703731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.703747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.703757 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.720439 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.725724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.725792 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.725812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.725836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.725854 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.741629 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.746183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.746244 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.746258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.746278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.746292 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.761949 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:14 crc kubenswrapper[4835]: E1002 10:56:14.762153 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.764016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.764054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.764070 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.764111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.764127 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.867648 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.867701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.867710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.867725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.867735 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.970405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.970473 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.970492 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.970525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:14 crc kubenswrapper[4835]: I1002 10:56:14.970548 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:14Z","lastTransitionTime":"2025-10-02T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.074340 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.074395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.074406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.074425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.074437 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.177646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.177703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.177715 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.177739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.177751 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.251656 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.251852 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.281089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.281159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.281180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.281209 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.281259 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.294545 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.294814 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.294960 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:56:31.294918965 +0000 UTC m=+67.854826706 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.384133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.384257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.384285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.384316 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.384336 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.487565 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.487613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.487625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.487647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.487660 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.590683 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.590753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.590770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.590802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.590825 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.694045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.694094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.694108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.694126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.694140 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.797191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.797334 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.797360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.797390 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.797414 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.901694 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.901773 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.901798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.901830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.901853 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:15Z","lastTransitionTime":"2025-10-02T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.901994 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.902329 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:56:47.902284806 +0000 UTC m=+84.462192567 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.902429 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:15 crc kubenswrapper[4835]: I1002 10:56:15.902530 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.902588 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.902732 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:47.902706678 +0000 UTC m=+84.462614279 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.902770 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:56:15 crc kubenswrapper[4835]: E1002 10:56:15.902902 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:47.902868003 +0000 UTC m=+84.462775584 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.003441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.003514 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003667 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003688 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003702 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003760 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:48.003741373 +0000 UTC m=+84.563648964 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003763 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003823 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003844 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.003946 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:56:48.003917298 +0000 UTC m=+84.563824879 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.005749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.005808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.005823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.005842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.005860 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.109781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.109835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.109847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.109869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.109884 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.211998 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.212363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.212377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.212402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.212416 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.251027 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.251034 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.251049 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.251182 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.251412 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:16 crc kubenswrapper[4835]: E1002 10:56:16.251471 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.315012 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.315067 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.315082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.315105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.315118 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.418058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.418127 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.418139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.418180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.418195 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.522051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.522119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.522138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.522168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.522188 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.625005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.625075 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.625093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.625118 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.625138 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.728125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.728183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.728200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.728252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.728270 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.831122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.831188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.831205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.831259 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.831279 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.934366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.934431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.934449 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.934474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:16 crc kubenswrapper[4835]: I1002 10:56:16.934502 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:16Z","lastTransitionTime":"2025-10-02T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.038180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.038280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.038300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.038327 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.038348 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.141903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.141966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.141983 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.142009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.142028 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.245761 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.245819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.245830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.245853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.245867 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.251439 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:17 crc kubenswrapper[4835]: E1002 10:56:17.251650 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.349120 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.349196 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.349215 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.349290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.349308 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.453952 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.454037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.454060 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.454094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.454118 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.558194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.558322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.558348 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.558372 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.558390 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.662101 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.662178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.662202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.662269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.662295 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.766058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.766131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.766151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.766181 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.766202 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.869960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.870003 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.870014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.870032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.870044 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.973817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.973880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.973891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.973911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:17 crc kubenswrapper[4835]: I1002 10:56:17.973923 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:17Z","lastTransitionTime":"2025-10-02T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.077419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.077483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.077501 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.077528 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.077547 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.181085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.181190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.181211 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.181271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.181292 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.251866 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.251916 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:18 crc kubenswrapper[4835]: E1002 10:56:18.252123 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.252187 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:18 crc kubenswrapper[4835]: E1002 10:56:18.252407 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:18 crc kubenswrapper[4835]: E1002 10:56:18.252588 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.284692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.284755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.284766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.284788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.284800 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.387954 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.388010 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.388027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.388049 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.388062 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.491033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.491085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.491095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.491122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.491136 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.595717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.595781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.595799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.595822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.595840 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.700511 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.700598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.700626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.700661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.700684 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.803154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.803265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.803285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.803314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.803333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.906960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.907008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.907018 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.907035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:18 crc kubenswrapper[4835]: I1002 10:56:18.907049 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:18Z","lastTransitionTime":"2025-10-02T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.010112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.010199 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.010213 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.010270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.010285 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.114717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.114786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.114802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.114827 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.114846 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.217853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.217902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.217919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.217944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.217959 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.251288 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:19 crc kubenswrapper[4835]: E1002 10:56:19.251520 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.321082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.321144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.321162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.321189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.321213 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.423993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.424031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.424041 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.424061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.424072 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.528452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.528510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.528525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.528545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.528560 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.631780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.631843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.631854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.631873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.631885 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.664608 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.678991 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.684109 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.699811 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.726204 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.734866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.734911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.734922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.734941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.734955 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.743367 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/r
un/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.761485 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.787548 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.799846 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.811510 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.827838 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.838140 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.838185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.838203 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.838271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.838286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.845541 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.859915 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.877174 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.892758 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.908147 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.921056 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.932315 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.941420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.941488 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.941505 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.941530 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:19 crc kubenswrapper[4835]: I1002 10:56:19.941547 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:19Z","lastTransitionTime":"2025-10-02T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.044493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.044551 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.044564 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.044588 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.044603 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.147214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.147289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.147299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.147323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.147334 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.250162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.250251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.250279 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.250306 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.250323 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.250894 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.250964 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:20 crc kubenswrapper[4835]: E1002 10:56:20.251020 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.251075 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:20 crc kubenswrapper[4835]: E1002 10:56:20.251131 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:20 crc kubenswrapper[4835]: E1002 10:56:20.251277 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.353327 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.353383 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.353399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.353421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.353438 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.456189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.456282 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.456297 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.456319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.456333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.559488 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.559725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.559739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.559782 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.559797 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.662349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.662388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.662399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.662415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.662426 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.765338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.765405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.765417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.765440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.765451 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.869386 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.869471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.869514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.869555 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.869584 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.973749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.973827 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.973855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.973891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:20 crc kubenswrapper[4835]: I1002 10:56:20.973919 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:20Z","lastTransitionTime":"2025-10-02T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.077852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.077924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.077941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.077961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.077974 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.181392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.181451 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.181463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.181495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.181512 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.251094 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:21 crc kubenswrapper[4835]: E1002 10:56:21.251331 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.284113 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.284167 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.284182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.284200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.284212 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.387425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.387468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.387477 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.387494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.387504 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.492385 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.492482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.492507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.492542 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.492564 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.596559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.596637 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.596654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.596684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.596703 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.699867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.699933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.699949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.699982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.700000 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.803113 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.803186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.803209 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.803279 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.803301 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.906944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.906997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.907009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.907029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:21 crc kubenswrapper[4835]: I1002 10:56:21.907043 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:21Z","lastTransitionTime":"2025-10-02T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.010348 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.010388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.010399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.010417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.010429 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.058689 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.083360 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-op
erator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.102803 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.114320 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.114397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.114426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.114459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.114484 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.119402 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.134198 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.154289 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.179000 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.194752 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.212861 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.217034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.217077 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.217088 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.217108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.217123 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.231591 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.250913 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.250963 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.250993 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:22 crc kubenswrapper[4835]: E1002 10:56:22.251077 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:22 crc kubenswrapper[4835]: E1002 10:56:22.251266 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:22 crc kubenswrapper[4835]: E1002 10:56:22.251476 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.256736 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d
1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.271977 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.285842 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.298692 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.311713 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.319530 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.319581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.319596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.319618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.319638 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.325637 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.338175 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.351861 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.422825 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.422884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.422898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.422917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.422929 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.526074 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.526138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.526147 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.526186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.526200 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.629286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.629350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.629370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.629396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.629414 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.731425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.731467 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.731479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.731495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.731506 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.834946 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.835007 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.835021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.835043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.835056 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.939211 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.939312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.939337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.939371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:22 crc kubenswrapper[4835]: I1002 10:56:22.939394 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:22Z","lastTransitionTime":"2025-10-02T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.043006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.043085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.043105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.043136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.043155 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.146580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.146623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.146652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.146673 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.146683 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.249405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.249475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.249491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.249514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.249528 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.251649 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:23 crc kubenswrapper[4835]: E1002 10:56:23.251794 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.353401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.353862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.354014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.354155 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.354357 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.457302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.457366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.457380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.457403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.457417 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.561692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.562122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.562290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.562438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.562657 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.666622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.666908 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.666974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.667048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.667112 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.770776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.770834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.770853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.770884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.770905 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.874329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.874381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.874389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.874413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.874424 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.978640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.978706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.978724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.978747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:23 crc kubenswrapper[4835]: I1002 10:56:23.978767 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:23Z","lastTransitionTime":"2025-10-02T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.082414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.082463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.082479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.082503 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.082520 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.186214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.186341 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.186363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.186393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.186413 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.250896 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.250943 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:24 crc kubenswrapper[4835]: E1002 10:56:24.251110 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.251211 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:24 crc kubenswrapper[4835]: E1002 10:56:24.251486 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:24 crc kubenswrapper[4835]: E1002 10:56:24.251859 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.268509 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.289051 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"c
ni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388
e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" 
for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.290660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.290740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.290764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.290801 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.290826 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.321273 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d
1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.335803 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.350724 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.364550 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.380437 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.392404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.392447 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.392456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.392474 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.392485 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.400457 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.415584 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.428561 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.453720 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.468050 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.481584 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.492976 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.494868 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.494906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.494917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.494936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.494947 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.508037 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.522807 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.538665 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:24Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.597816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.597863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.597875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.597894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.597904 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.700426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.700468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.700477 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.700494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.700532 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.803108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.803163 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.803180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.803202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.803214 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.906015 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.906054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.906069 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.906098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.906112 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.993774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.993814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.993824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.993842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:24 crc kubenswrapper[4835]: I1002 10:56:24.993854 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:24Z","lastTransitionTime":"2025-10-02T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: E1002 10:56:25.009340 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.013985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.014073 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.014086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.014106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.014116 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: E1002 10:56:25.026594 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.030451 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.030509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.030521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.030542 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.030555 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: E1002 10:56:25.043838 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.054362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.054426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.054438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.054617 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.054637 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: E1002 10:56:25.070866 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.074594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.074623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.074631 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.074648 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.074658 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: E1002 10:56:25.087478 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:25 crc kubenswrapper[4835]: E1002 10:56:25.087601 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.089403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.089431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.089438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.089456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.089466 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.191952 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.192018 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.192031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.192070 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.192084 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.251363 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:25 crc kubenswrapper[4835]: E1002 10:56:25.251543 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.294609 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.294656 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.294668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.294686 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.294696 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.396816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.396859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.396869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.396887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.396902 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.499147 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.499191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.499201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.499235 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.499247 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.603081 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.603145 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.603164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.603191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.603210 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.705794 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.705845 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.705853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.705873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.705884 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.808320 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.808363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.808371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.808388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.808399 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.912327 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.912807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.912817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.912843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:25 crc kubenswrapper[4835]: I1002 10:56:25.912856 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:25Z","lastTransitionTime":"2025-10-02T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.016194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.016263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.016273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.016295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.016306 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.120316 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.120366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.120381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.120401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.120412 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.223685 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.223738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.223752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.223776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.223792 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.251836 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.251836 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:26 crc kubenswrapper[4835]: E1002 10:56:26.251995 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.251862 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:26 crc kubenswrapper[4835]: E1002 10:56:26.252233 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:26 crc kubenswrapper[4835]: E1002 10:56:26.252322 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.326769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.326817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.326830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.326854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.326868 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.429354 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.429406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.429421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.429442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.429455 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.532449 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.532587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.532608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.532638 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.532657 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.635324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.635372 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.635382 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.635401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.635414 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.739657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.739741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.739760 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.739787 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.739805 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.842135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.842206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.842252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.842281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.842303 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.944755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.944825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.944844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.944873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:26 crc kubenswrapper[4835]: I1002 10:56:26.944891 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:26Z","lastTransitionTime":"2025-10-02T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.047405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.047471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.047482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.047504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.047517 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.150331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.150381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.150390 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.150410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.150421 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.251082 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:27 crc kubenswrapper[4835]: E1002 10:56:27.251382 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.252884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.252937 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.252950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.252971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.252985 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.359747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.359784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.359793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.359811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.360249 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.462397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.462428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.462436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.462452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.462460 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.565493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.565555 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.565574 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.565601 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.565618 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.667863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.667904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.667915 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.667933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.667944 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.770077 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.770106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.770114 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.770130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.770140 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.872523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.872568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.872579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.872599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.872611 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.975126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.975177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.975189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.975210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:27 crc kubenswrapper[4835]: I1002 10:56:27.975249 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:27Z","lastTransitionTime":"2025-10-02T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.077159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.077212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.077246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.077272 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.077286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.180137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.180182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.180193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.180215 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.180245 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.251263 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.251325 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.251341 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:28 crc kubenswrapper[4835]: E1002 10:56:28.251453 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:28 crc kubenswrapper[4835]: E1002 10:56:28.251544 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:28 crc kubenswrapper[4835]: E1002 10:56:28.251679 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.252699 4835 scope.go:117] "RemoveContainer" containerID="5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693" Oct 02 10:56:28 crc kubenswrapper[4835]: E1002 10:56:28.253008 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.283180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.283242 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.283256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.283273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.283285 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.397835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.397882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.397898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.397918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.397931 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.500398 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.500430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.500439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.500454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.500464 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.602975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.603004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.603012 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.603027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.603036 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.707139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.707196 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.707206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.707243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.707258 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.810770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.810857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.810906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.810934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.810950 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.914613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.914661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.914678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.914703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:28 crc kubenswrapper[4835]: I1002 10:56:28.914724 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:28Z","lastTransitionTime":"2025-10-02T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.017864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.017918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.017929 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.017950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.017962 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.120143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.120252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.120271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.120302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.120332 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.223662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.223718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.223732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.223750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.223762 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.251834 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:29 crc kubenswrapper[4835]: E1002 10:56:29.252137 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.326751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.326786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.326796 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.326812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.326822 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.429718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.429795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.429815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.429873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.429890 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.533190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.533292 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.533312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.533342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.533367 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.636494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.636862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.637044 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.637166 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.637321 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.739913 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.739970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.739980 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.739999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.740010 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.842620 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.842693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.842706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.842729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.842743 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.945780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.945855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.945867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.945887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:29 crc kubenswrapper[4835]: I1002 10:56:29.945899 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:29Z","lastTransitionTime":"2025-10-02T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.049332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.049399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.049416 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.049440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.049460 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.155001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.155504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.155663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.155817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.155973 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.251632 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.251817 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:30 crc kubenswrapper[4835]: E1002 10:56:30.251946 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.252060 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:30 crc kubenswrapper[4835]: E1002 10:56:30.252353 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:30 crc kubenswrapper[4835]: E1002 10:56:30.252461 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.259043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.259091 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.259107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.259125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.259139 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.362057 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.362152 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.362179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.362214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.362279 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.464941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.464988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.465001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.465021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.465033 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.568095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.568133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.568145 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.568162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.568174 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.671759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.671824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.671836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.671857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.671870 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.789084 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.789129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.789138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.789156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.789166 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.893777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.893816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.893829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.893847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.893861 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.996210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.996274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.996288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.996310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:30 crc kubenswrapper[4835]: I1002 10:56:30.996332 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:30Z","lastTransitionTime":"2025-10-02T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.099074 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.099141 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.099160 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.099186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.099204 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.201933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.201982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.201992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.202008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.202020 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.251846 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:31 crc kubenswrapper[4835]: E1002 10:56:31.251997 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
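The entries above all trace back to one condition: the container runtime reports NetworkReady=false because it finds no CNI configuration file under /etc/kubernetes/cni/net.d/, so the kubelet keeps the node NotReady and skips syncing any pod that needs pod networking. As a standalone illustration of that check (a minimal sketch, not the kubelet's or CRI-O's actual code), the Python snippet below looks for config files in the directory named in the log; the extension list is an assumption about what a CNI runtime typically loads.

# Illustrative sketch only: mimics the "no CNI configuration file" condition from the
# log by listing CNI config files in the directory the kubelet reports.
# The directory path is taken verbatim from the log; the extensions are an assumption.
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"        # path from the log message
CNI_EXTENSIONS = (".conf", ".conflist", ".json")  # assumed config file extensions

def cni_configs(conf_dir: str = CNI_CONF_DIR) -> list[str]:
    """Return CNI config files found in conf_dir (empty list if none or dir missing)."""
    try:
        entries = sorted(os.listdir(conf_dir))
    except FileNotFoundError:
        return []
    return [os.path.join(conf_dir, e) for e in entries if e.endswith(CNI_EXTENSIONS)]

if __name__ == "__main__":
    found = cni_configs()
    if found:
        print("CNI configuration present:", *found, sep="\n  ")
    else:
        # This is the state the log shows: NetworkReady=false / NetworkPluginNotReady.
        print(f"No CNI configuration file in {CNI_CONF_DIR} -- node stays NotReady")

On an OVN-Kubernetes node that directory is normally populated by the ovnkube-node pod once it starts, which matches the log's own hint to check whether the network provider has started.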
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.297565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:31 crc kubenswrapper[4835]: E1002 10:56:31.297778 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:31 crc kubenswrapper[4835]: E1002 10:56:31.297891 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:57:03.297866675 +0000 UTC m=+99.857774256 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.304381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.304423 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.304432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.304449 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.304459 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.407318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.407357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.407367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.407384 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.407395 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.510289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.510330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.510339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.510355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.510366 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.613762 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.613848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.613860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.613881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.613894 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.716701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.716751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.716764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.716784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.716798 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.819112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.819160 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.819174 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.819189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.819201 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.921706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.921747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.921759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.921775 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:31 crc kubenswrapper[4835]: I1002 10:56:31.921788 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:31Z","lastTransitionTime":"2025-10-02T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.025800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.025858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.025869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.025891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.025903 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.128880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.128937 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.128972 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.128996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.129011 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.231460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.231523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.231735 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.231786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.231797 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.250850 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.250896 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.250920 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:32 crc kubenswrapper[4835]: E1002 10:56:32.251020 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:32 crc kubenswrapper[4835]: E1002 10:56:32.251119 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:32 crc kubenswrapper[4835]: E1002 10:56:32.251283 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.334643 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.334712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.334727 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.334747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.334760 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.437319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.437834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.437916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.437996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.438134 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.541881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.541937 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.541949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.541989 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.542000 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.645179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.645270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.645290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.645318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.645335 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.748888 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.748936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.748948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.748970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.748980 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.852073 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.852131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.852196 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.852277 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.852301 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.954655 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.954709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.954722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.954743 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:32 crc kubenswrapper[4835]: I1002 10:56:32.954756 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:32Z","lastTransitionTime":"2025-10-02T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.057592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.057647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.057660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.057681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.057697 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.160934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.160982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.160993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.161012 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.161028 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.251529 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:33 crc kubenswrapper[4835]: E1002 10:56:33.251736 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
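Alongside the network-readiness errors, the MountVolume.SetUp failure at 10:56:31 shows the metrics-certs volume for network-metrics-daemon-5j5j6 failing with 'object "openshift-multus"/"metrics-daemon-secret" not registered' and a 32s retry backoff. That wording usually points at the kubelet's secret manager not having registered the pod's secrets yet (common while the node is NotReady) rather than at a missing Secret, but the quickest way to rule out the latter is to ask the API server directly. A minimal sketch, assuming the kubernetes Python client is installed and a kubeconfig that can read the openshift-multus namespace; the secret name and namespace come from the log:

# Sketch, not a fix: confirm whether the Secret behind the failing metrics-certs
# volume mount actually exists on the API server.
from kubernetes import client, config
from kubernetes.client.rest import ApiException

NAMESPACE = "openshift-multus"         # from the log
SECRET_NAME = "metrics-daemon-secret"  # from the log

def secret_exists(name: str, namespace: str) -> bool:
    config.load_kube_config()          # or config.load_incluster_config() inside a pod
    v1 = client.CoreV1Api()
    try:
        v1.read_namespaced_secret(name, namespace)
        return True
    except ApiException as err:
        if err.status == 404:
            return False
        raise

if __name__ == "__main__":
    if secret_exists(SECRET_NAME, NAMESPACE):
        print(f"{NAMESPACE}/{SECRET_NAME} exists; 'not registered' likely means the "
              "kubelet has not synced it for this pod yet")
    else:
        print(f"{NAMESPACE}/{SECRET_NAME} is missing from the API server")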
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.263952 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.264000 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.264011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.264031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.264049 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.367188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.367287 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.367300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.367325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.367341 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.472530 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.472593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.472605 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.472626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.472638 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.576366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.576406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.576420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.576439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.576450 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.679210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.679267 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.679277 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.679295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.679306 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.781688 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.781740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.781749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.781766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.781776 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.885667 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.885717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.885725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.885745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.885754 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.989181 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.989266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.989284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.989318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:33 crc kubenswrapper[4835]: I1002 10:56:33.989339 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:33Z","lastTransitionTime":"2025-10-02T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.092112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.092188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.092205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.092438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.092458 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.195518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.195586 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.195606 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.195635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.195653 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.253433 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.253493 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.253630 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:34 crc kubenswrapper[4835]: E1002 10:56:34.253638 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:34 crc kubenswrapper[4835]: E1002 10:56:34.253769 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:34 crc kubenswrapper[4835]: E1002 10:56:34.253863 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.267621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.283043 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.299585 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.299646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 
10:56:34.299863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.299889 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.299919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.299958 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.318942 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d
1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.338795 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.355990 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.373994 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.388875 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.401843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.401870 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.401905 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.401927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.401944 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.406024 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.424677 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.439732 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.454196 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.469257 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7
f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.487417 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.506107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.506154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.506167 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.506188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.506202 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.509520 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.526924 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.548655 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.609571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.609663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.609682 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.609712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.609730 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.712831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.712907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.712926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.712959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.712989 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.771236 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/0.log" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.771281 4835 generic.go:334] "Generic (PLEG): container finished" podID="cea2edfd-8b9c-44be-be9a-d2feb410da71" containerID="48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b" exitCode=1 Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.771313 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerDied","Data":"48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.771707 4835 scope.go:117] "RemoveContainer" containerID="48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.788276 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.807609 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.815866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.815916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.815934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.815955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.815968 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.823786 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.839325 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.864975 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.896575 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.911670 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.919600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.919773 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.919875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.919975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.920083 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:34Z","lastTransitionTime":"2025-10-02T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.928137 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.947241 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.965116 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.976699 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:34 crc kubenswrapper[4835]: I1002 10:56:34.989960 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.005606 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.024095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.024146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.024158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.024212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.024281 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.026842 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a552
18c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.040868 4835 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.057084 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.069802 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.127796 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.127879 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.127904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.127947 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.127974 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.217467 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.217534 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.217544 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.217562 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.217573 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: E1002 10:56:35.230592 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.235718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.235800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.235828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.235881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.235910 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.250995 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:35 crc kubenswrapper[4835]: E1002 10:56:35.251278 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:35 crc kubenswrapper[4835]: E1002 10:56:35.254061 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed2
1\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.259179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.259211 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.259235 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.259250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.259260 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: E1002 10:56:35.275827 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.279989 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.280053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.280068 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.280089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.280100 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: E1002 10:56:35.299926 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.306681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.306715 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.306728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.306746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.306757 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: E1002 10:56:35.320290 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: E1002 10:56:35.320574 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.322520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.322565 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.322579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.322600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.322938 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.426247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.426294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.426313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.426337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.426352 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.528718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.528754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.528762 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.528779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.528790 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.631397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.631446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.631457 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.631480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.631493 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.734188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.734265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.734278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.734298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.734310 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.777411 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/0.log" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.777495 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerStarted","Data":"aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.794264 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.810064 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.823931 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.836672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.836766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.836833 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.836860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.836907 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.838672 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.858478 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.878041 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.890482 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.904403 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.923113 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.939867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.939935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.939951 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.939977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.939994 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:35Z","lastTransitionTime":"2025-10-02T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.940429 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.956100 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:35 crc kubenswrapper[4835]: I1002 10:56:35.975779 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.000171 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.021351 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.043286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.043666 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.043793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.043889 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.043975 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.049241 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.066072 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.078521 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.145895 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.145953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.145963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.145979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.146012 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.249431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.249482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.249495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.249510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.249520 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.251785 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.251820 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.251789 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:36 crc kubenswrapper[4835]: E1002 10:56:36.251938 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:36 crc kubenswrapper[4835]: E1002 10:56:36.252057 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:36 crc kubenswrapper[4835]: E1002 10:56:36.252187 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.352453 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.352505 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.352517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.352538 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.352551 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.455817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.455859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.455870 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.455887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.455910 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.558695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.558736 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.558745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.558761 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.558773 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.661754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.661794 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.661805 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.661821 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.661834 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.766357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.766435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.766454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.766483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.766505 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.869293 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.869341 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.869352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.869370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.869406 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.972462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.972533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.972555 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.972586 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:36 crc kubenswrapper[4835]: I1002 10:56:36.972606 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:36Z","lastTransitionTime":"2025-10-02T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.074468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.074519 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.074533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.074551 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.074565 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.177561 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.177622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.177640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.177666 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.177684 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.251289 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:37 crc kubenswrapper[4835]: E1002 10:56:37.251434 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.280351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.280397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.280406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.280424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.280434 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.383966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.384045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.384066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.384092 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.384110 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.486621 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.486684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.486697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.486719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.486734 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.589151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.589211 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.589269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.589301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.589335 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.692853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.692893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.692902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.692919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.692929 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.795548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.795592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.795601 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.795615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.795624 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.897872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.897902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.897909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.897923 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:37 crc kubenswrapper[4835]: I1002 10:56:37.897932 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:37Z","lastTransitionTime":"2025-10-02T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.000669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.000698 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.000705 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.000720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.000729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.103633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.103686 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.103705 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.103728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.103744 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.206382 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.206429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.206449 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.206468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.206479 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.251464 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.251464 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:38 crc kubenswrapper[4835]: E1002 10:56:38.251839 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:38 crc kubenswrapper[4835]: E1002 10:56:38.251635 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.251485 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:38 crc kubenswrapper[4835]: E1002 10:56:38.251958 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.308766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.308807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.308815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.308832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.308843 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.411337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.411448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.411468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.411497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.411515 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.515316 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.515599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.515635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.515741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.515855 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.617837 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.617869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.617878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.617892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.617902 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.720876 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.720931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.720953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.720982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.721003 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.824016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.824080 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.824098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.824125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.824143 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.927290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.927333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.927347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.927368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:38 crc kubenswrapper[4835]: I1002 10:56:38.927384 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:38Z","lastTransitionTime":"2025-10-02T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.029710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.029752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.029765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.029781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.029792 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.132468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.132502 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.132512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.132529 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.132540 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.235434 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.235522 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.235532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.235552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.235566 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.251748 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:39 crc kubenswrapper[4835]: E1002 10:56:39.251926 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.337986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.338029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.338040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.338058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.338070 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.440775 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.440819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.440831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.440848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.440860 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.543018 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.543064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.543073 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.543089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.543099 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.645581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.645629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.645640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.645659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.645671 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.748704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.749168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.749268 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.749345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.749421 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.852495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.852573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.852593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.852623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.852645 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.956115 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.956455 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.956522 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.956594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:39 crc kubenswrapper[4835]: I1002 10:56:39.956669 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:39Z","lastTransitionTime":"2025-10-02T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.059578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.059889 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.059955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.060027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.060142 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.163556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.163597 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.163610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.163629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.163644 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.251310 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.251351 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:40 crc kubenswrapper[4835]: E1002 10:56:40.251488 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.251396 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:40 crc kubenswrapper[4835]: E1002 10:56:40.251642 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:40 crc kubenswrapper[4835]: E1002 10:56:40.251813 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.266647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.266738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.266751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.266776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.266787 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.369068 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.369149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.369170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.369203 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.369256 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.473430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.473472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.473483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.473502 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.473513 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.576335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.576410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.576436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.576475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.576501 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.680510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.680586 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.680605 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.680642 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.680661 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.784356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.784416 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.784430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.784471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.784498 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.889355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.889794 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.889955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.890093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.890247 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.997993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.998051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.998071 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.998093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:40 crc kubenswrapper[4835]: I1002 10:56:40.998104 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:40Z","lastTransitionTime":"2025-10-02T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.101284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.101338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.101352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.101375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.101392 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.204115 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.204151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.204159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.204177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.204189 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.251484 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:41 crc kubenswrapper[4835]: E1002 10:56:41.251738 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.308740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.308835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.308855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.308890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.308913 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.412671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.412736 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.412751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.412772 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.412786 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.516028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.516074 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.516085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.516103 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.516114 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.619820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.619894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.619916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.619949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.619970 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.724833 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.724910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.724928 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.724957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.724977 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.828966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.829043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.829061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.829088 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.829107 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.933451 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.933518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.933537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.933567 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:41 crc kubenswrapper[4835]: I1002 10:56:41.933585 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:41Z","lastTransitionTime":"2025-10-02T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.037619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.037712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.037737 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.037769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.037790 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.140820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.141318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.141493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.141657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.141812 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.245133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.245521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.245609 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.245750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.245841 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.251774 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.251774 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.251909 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:42 crc kubenswrapper[4835]: E1002 10:56:42.252136 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:42 crc kubenswrapper[4835]: E1002 10:56:42.252157 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:42 crc kubenswrapper[4835]: E1002 10:56:42.252468 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.350347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.350409 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.350433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.350458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.350479 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.454123 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.454168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.454180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.454200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.454214 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.557613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.557680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.557700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.557727 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.557745 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.661849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.661924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.661944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.661977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.662000 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.766082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.766175 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.766197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.766257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.766281 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.870204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.870332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.870359 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.870396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.870424 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.973751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.974044 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.974147 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.974316 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:42 crc kubenswrapper[4835]: I1002 10:56:42.974469 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:42Z","lastTransitionTime":"2025-10-02T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.078711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.079128 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.079288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.079447 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.079604 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.183331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.183410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.183428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.183465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.183489 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.251287 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:43 crc kubenswrapper[4835]: E1002 10:56:43.251502 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.252782 4835 scope.go:117] "RemoveContainer" containerID="5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.287662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.287725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.287824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.287852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.287873 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.390857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.390936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.390957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.390986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.391006 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.493704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.493752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.493764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.493786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.493800 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.597908 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.597971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.597982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.598017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.598027 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.701276 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.701330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.701341 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.701360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.701372 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.804011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.804046 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.804056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.804074 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.804086 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.808065 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/2.log" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.810517 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.811589 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.830913 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 
10:56:43.845407 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.858031 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.871130 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.883639 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.896196 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.905921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.905953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.905963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.905977 
4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.905986 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:43Z","lastTransitionTime":"2025-10-02T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.908490 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.921613 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.933203 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.954264 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.974451 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:43 crc kubenswrapper[4835]: I1002 10:56:43.994985 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:43Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.007684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.007741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.007758 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.007781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.007798 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.008107 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.021106 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.033763 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.047183 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.059579 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.110681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.110802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.110829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.110865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.110893 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.213814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.213875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.213892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.213919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.213936 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.251247 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:44 crc kubenswrapper[4835]: E1002 10:56:44.251392 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.251436 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.251531 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:44 crc kubenswrapper[4835]: E1002 10:56:44.251617 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:44 crc kubenswrapper[4835]: E1002 10:56:44.251708 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.271651 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.293202 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.317600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.317648 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.317664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.317687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.317701 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.342784 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.357899 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.371678 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.384976 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.399361 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.416277 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.420483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.420537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.420548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.420568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.420580 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.431358 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.444629 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.464281 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.480061 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.496724 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.513323 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.523424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.523460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc 
kubenswrapper[4835]: I1002 10:56:44.523468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.523483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.523495 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.535093 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba09
26e89053e90c07955250f6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.550517 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.571319 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.626604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.626678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.626699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.626725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.626743 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.729464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.729869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.730076 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.730345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.730566 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.817572 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/3.log" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.819069 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/2.log" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.822002 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" exitCode=1 Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.822039 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.822073 4835 scope.go:117] "RemoveContainer" containerID="5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.823434 4835 scope.go:117] "RemoveContainer" containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" Oct 02 10:56:44 crc kubenswrapper[4835]: E1002 10:56:44.823881 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.834577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.834614 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.834623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.834602 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.834642 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.834653 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.846096 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.858743 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.876608 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.893040 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.906365 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.922413 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.935132 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.939235 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.939389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.939459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.939534 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.939597 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:44Z","lastTransitionTime":"2025-10-02T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.949730 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.962415 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.976011 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:44 crc kubenswrapper[4835]: I1002 10:56:44.989057 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.004072 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.022249 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.040407 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.041954 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.041985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc 
kubenswrapper[4835]: I1002 10:56:45.042023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.042043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.042055 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.064487 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba09
26e89053e90c07955250f6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5dd7536b4c007d5824d96b77ac1e187194fb2d1bf49c422fffa9a826f8d693\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:12Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126251 6494 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126449 6494 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 10:56:12.126633 6494 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:56:12.126788 6494 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 10:56:12.127352 6494 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:56:12.127415 6494 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:56:12.127476 6494 factory.go:656] Stopping watch factory\\\\nI1002 10:56:12.127492 6494 ovnkube.go:599] Stopped ovnkube\\\\nI1002 10:56:12.127528 6494 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:56:12.127536 6494 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:56:12.127547 6494 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1002 10:56:12.127622 6494 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:44Z\\\",\\\"message\\\":\\\":{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-oauth-apiserver/api]} name:Service_openshift-oauth-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.140:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {fe46cb89-4e54-4175-a112-1c5224cd299e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:56:44.209174 6862 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 10:56:44.209292 6862 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.079069 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.145090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.145139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.145150 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.145168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.145181 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.248706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.248763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.248775 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.248804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.248819 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.251398 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.251625 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.351523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.351578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.351592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.351618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.351631 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.454289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.454405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.454418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.454438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.454451 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.557206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.557260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.557270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.557309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.557326 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.652038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.652110 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.652131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.652161 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.652182 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.665822 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.672616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.672671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.672692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.672718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.672740 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.695311 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.701939 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.702009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.702030 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.702063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.702080 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.723194 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.728488 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.728886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.728954 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.729035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.729118 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.744935 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.750584 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.750634 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.750649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.750671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.750685 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.768740 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 
2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.768913 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.771090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.771143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.771158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.771183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.771200 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.827926 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/3.log" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.832943 4835 scope.go:117] "RemoveContainer" containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" Oct 02 10:56:45 crc kubenswrapper[4835]: E1002 10:56:45.833174 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.855346 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.874712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.874781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.874798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.874829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.874847 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.879357 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.895024 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.909672 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.926067 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.945525 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.966762 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.978335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.978394 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.978409 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.978431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.978447 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:45Z","lastTransitionTime":"2025-10-02T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:45 crc kubenswrapper[4835]: I1002 10:56:45.988842 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.011865 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.031810 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.051729 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.068293 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.082262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.082309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.082322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.082341 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.082355 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.085348 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.098879 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc 
kubenswrapper[4835]: I1002 10:56:46.120641 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.141553 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.166797 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:44Z\\\",\\\"message\\\":\\\":{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-oauth-apiserver/api]} name:Service_openshift-oauth-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.140:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {fe46cb89-4e54-4175-a112-1c5224cd299e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:56:44.209174 6862 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 10:56:44.209292 6862 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.186495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.186541 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.186554 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.186577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.186594 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.252544 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.252675 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.252562 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:46 crc kubenswrapper[4835]: E1002 10:56:46.252780 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:46 crc kubenswrapper[4835]: E1002 10:56:46.252946 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:46 crc kubenswrapper[4835]: E1002 10:56:46.253056 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.289611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.289680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.289699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.289727 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.289748 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.393526 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.393600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.393615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.393641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.393658 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.498377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.498462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.498490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.498526 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.498552 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.602594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.602678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.602704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.602734 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.602754 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.705726 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.705786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.705795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.705811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.705822 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.810376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.810441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.810459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.810487 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.810506 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.913662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.913733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.913757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.913789 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:46 crc kubenswrapper[4835]: I1002 10:56:46.913813 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:46Z","lastTransitionTime":"2025-10-02T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.016723 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.016828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.016853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.016881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.016901 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.121180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.121732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.121751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.121778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.121799 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.225849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.225907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.225927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.225958 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.225978 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.253340 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:47 crc kubenswrapper[4835]: E1002 10:56:47.253657 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.269113 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.328812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.328857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.328871 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.328887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.328899 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.432106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.432176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.432195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.432266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.432292 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.535782 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.535859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.535885 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.535918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.535941 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.639585 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.639645 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.639660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.639680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.639696 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.743165 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.743248 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.743262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.743285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.743298 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.845969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.846047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.846072 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.846106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.846125 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.948673 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.948714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.948724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.948740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.948751 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:47Z","lastTransitionTime":"2025-10-02T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.999177 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:56:47 crc kubenswrapper[4835]: E1002 10:56:47.999382 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.999349126 +0000 UTC m=+148.559256717 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.999440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:47 crc kubenswrapper[4835]: I1002 10:56:47.999483 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:47 crc kubenswrapper[4835]: E1002 10:56:47.999557 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:56:47 crc kubenswrapper[4835]: E1002 10:56:47.999603 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:56:47 crc kubenswrapper[4835]: E1002 10:56:47.999617 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.999601954 +0000 UTC m=+148.559509535 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:56:47 crc kubenswrapper[4835]: E1002 10:56:47.999643 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.999634985 +0000 UTC m=+148.559542656 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.051295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.051329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.051337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.051351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.051361 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.100624 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.100720 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.100938 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.100970 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.100966 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.101047 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.101075 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.100987 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.101156 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.101130226 +0000 UTC m=+148.661037837 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.101212 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.101193838 +0000 UTC m=+148.661101429 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.154518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.154580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.154595 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.154624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.154639 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.251164 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.251260 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.251299 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.251447 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.251613 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:48 crc kubenswrapper[4835]: E1002 10:56:48.251789 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.257617 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.257652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.257664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.257683 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.257694 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.360650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.360729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.360749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.360779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.360802 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.463243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.463292 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.463304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.463323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.463336 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.566021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.566060 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.566072 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.566091 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.566104 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.669350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.669453 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.669483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.669565 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.669594 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.772482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.772537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.772551 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.772573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.772593 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.875750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.875825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.875844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.875873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.875889 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.979543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.979639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.979667 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.979702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:48 crc kubenswrapper[4835]: I1002 10:56:48.979728 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:48Z","lastTransitionTime":"2025-10-02T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.082904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.082994 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.083018 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.083054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.083079 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.186542 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.186607 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.186622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.186645 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.186660 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.251018 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:49 crc kubenswrapper[4835]: E1002 10:56:49.251409 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.289825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.289971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.290002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.290036 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.290058 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.393357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.393440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.393459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.393485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.393502 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.495717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.495757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.495766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.495781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.495789 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.599271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.599349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.599372 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.599404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.599427 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.701855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.702336 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.702509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.702664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.702806 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.806257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.806314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.806339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.806368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.806391 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.908990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.909370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.909410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.909436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:49 crc kubenswrapper[4835]: I1002 10:56:49.909458 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:49Z","lastTransitionTime":"2025-10-02T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.011911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.011969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.011988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.012013 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.012033 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.115256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.115379 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.115398 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.115422 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.115438 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.219265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.219337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.219353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.219377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.219391 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.251394 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.251415 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.251549 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:50 crc kubenswrapper[4835]: E1002 10:56:50.251652 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:50 crc kubenswrapper[4835]: E1002 10:56:50.251905 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:50 crc kubenswrapper[4835]: E1002 10:56:50.252195 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.322002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.322050 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.322059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.322074 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.322084 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.425719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.425778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.425792 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.425823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.425836 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.529016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.529089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.529106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.529137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.529160 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.632288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.632400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.632468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.632517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.632541 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.736198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.736281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.736299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.736323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.736339 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.839825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.839955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.839975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.840005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.840026 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.943601 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.943676 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.943693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.943722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:50 crc kubenswrapper[4835]: I1002 10:56:50.943738 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:50Z","lastTransitionTime":"2025-10-02T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.046318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.046362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.046370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.046386 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.046395 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.149306 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.149469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.149510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.149547 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.149570 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.251740 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:51 crc kubenswrapper[4835]: E1002 10:56:51.251956 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.254051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.254098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.254113 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.254136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.254155 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.357116 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.357177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.357201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.357247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.357263 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.459788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.459841 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.459852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.459870 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.459881 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.565533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.565683 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.565745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.565779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.565801 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.668882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.668961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.668984 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.669058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.669109 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.773847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.773922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.773935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.773959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.773976 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.878168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.878268 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.878283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.878312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.878324 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.981786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.981838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.981849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.981873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:51 crc kubenswrapper[4835]: I1002 10:56:51.981884 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:51Z","lastTransitionTime":"2025-10-02T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.084677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.084725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.084744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.084766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.084780 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.187838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.187878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.187887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.187901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.187910 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.252590 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.252657 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.252667 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:52 crc kubenswrapper[4835]: E1002 10:56:52.252740 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:52 crc kubenswrapper[4835]: E1002 10:56:52.253010 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:52 crc kubenswrapper[4835]: E1002 10:56:52.253431 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.290163 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.290267 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.290294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.290328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.290356 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.392961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.393112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.393137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.393168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.393190 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.497104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.497171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.497194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.497265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.497286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.600024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.600102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.600120 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.600145 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.600163 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.703717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.703819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.703847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.703887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.703941 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.807420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.807521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.807543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.807575 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.807595 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.911653 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.911721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.911741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.911769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:52 crc kubenswrapper[4835]: I1002 10:56:52.911787 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:52Z","lastTransitionTime":"2025-10-02T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.015302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.015366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.015389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.015416 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.015435 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.118467 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.119344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.119397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.119427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.119452 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.223433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.223509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.223552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.223579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.223596 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.251869 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:53 crc kubenswrapper[4835]: E1002 10:56:53.252120 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.326435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.326506 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.326517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.326553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.326565 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.429763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.429828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.429867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.429904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.429923 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.532687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.532759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.532777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.532802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.532819 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.636152 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.636254 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.636280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.636311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.636332 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.739703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.739772 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.739793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.739823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.739843 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.843064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.843149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.843178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.843210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.843277 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.946855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.946918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.946935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.946960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:53 crc kubenswrapper[4835]: I1002 10:56:53.946987 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:53Z","lastTransitionTime":"2025-10-02T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.050478 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.050535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.050554 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.050583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.050599 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.153831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.153962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.153987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.154025 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.154041 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.251356 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.251373 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.251642 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:54 crc kubenswrapper[4835]: E1002 10:56:54.251827 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:54 crc kubenswrapper[4835]: E1002 10:56:54.252003 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:54 crc kubenswrapper[4835]: E1002 10:56:54.252144 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.257611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.257661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.257679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.257702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.257722 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.275317 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.296006 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.312979 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.347411 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0dc7d77a-95e5-4f89-9e76-2eb57d4a0425\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09de897d1ce9009170b27f1e9d924acc713deeb1c11be5b19c573ba4df2d255f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://925933106ef051b2fe7dbd61758717f24d3f8370778fcfec65373884e7b09862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ed698d93bdcaf46bed5745e06f66ebef02a88d50802e5f7a304b4dba7b31a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0938eca70f2e31639a8168c94539e7a2ef5bacbc77795d117ce0392d9a4b52e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f781dffd3ee8eb8211c6663574642acd7de8cfe14b99e1c63d65d4c7c19f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02d4e31ef3f58036a2fe6c09f70020f137b6b9bbabcd3d9e6ecb39e7b38bc380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d4e31ef3f58036a2fe6c09f70020f137b6b9bbabcd3d9e6ecb39e7b38bc380\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6920ca4b62b0e37fe1fa3b7ed75f509a4dbecc41e23ea52c1366a3b3f82f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba6920ca
4b62b0e37fe1fa3b7ed75f509a4dbecc41e23ea52c1366a3b3f82f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://54b5275fb5b1530b809ab177485fd261602e5008be9c28381f72fac7ef0fdc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b5275fb5b1530b809ab177485fd261602e5008be9c28381f72fac7ef0fdc85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.361492 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.361592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.361619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.361650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.361672 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.373555 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.397885 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 
10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.420379 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.440820 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.464237 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.464286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.464299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.464322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.464340 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.471874 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:44Z\\\",\\\"message\\\":\\\":{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-oauth-apiserver/api]} name:Service_openshift-oauth-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.140:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {fe46cb89-4e54-4175-a112-1c5224cd299e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:56:44.209174 6862 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 10:56:44.209292 6862 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.490365 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.508298 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.527619 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.547379 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.568625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.568730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.568748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.568780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.568803 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.569621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.589339 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.604901 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.623140 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.647158 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7
f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.671720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.671791 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.671810 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.671840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.671858 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.774210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.774273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.774285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.774304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.774317 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.877005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.877079 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.877094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.877119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.877138 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.980058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.980114 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.980125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.980144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:54 crc kubenswrapper[4835]: I1002 10:56:54.980156 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:54Z","lastTransitionTime":"2025-10-02T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.082364 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.082428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.082446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.082469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.082486 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.185049 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.185109 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.185130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.185158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.185179 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.251906 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:55 crc kubenswrapper[4835]: E1002 10:56:55.252126 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.287401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.287428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.287436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.287449 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.287458 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.390556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.390602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.390615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.390632 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.390645 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.493308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.493378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.493396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.493430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.493458 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.596415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.596466 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.596477 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.596498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.596512 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.699099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.699160 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.699176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.699197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.699213 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.803051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.803114 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.803132 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.803161 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.803182 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.906786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.906839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.906852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.906873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:55 crc kubenswrapper[4835]: I1002 10:56:55.906886 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:55Z","lastTransitionTime":"2025-10-02T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.010738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.010802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.010813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.010834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.010857 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.114045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.114131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.114148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.114172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.114188 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.136638 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.136713 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.136725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.136757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.136774 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.160717 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.165271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.165357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.165377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.165408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.165430 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.184556 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.190066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.190149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.190168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.190197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.190212 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.208045 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.214329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.214392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.214406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.214429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.214445 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.229200 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.233788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.233881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
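The payload in the "Error updating node status, will retry" records above is a strategic merge patch against the Node's status subresource: the $setElementOrder/conditions directive pins the ordering of the condition entries, and most of the bulk is the node's image cache (name plus sizeBytes pairs under .status.images), which the kubelet includes in every status report, so every retry logs tens of kilobytes. Below is a minimal client-go sketch of an equivalent, much smaller status patch; the kubeconfig path is a placeholder, the condition body is illustrative only, and on this cluster the call would be rejected with the same node.network-node-identity.openshift.io webhook/x509 error the kubelet is hitting.

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/types"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Placeholder kubeconfig path; any admin kubeconfig for the cluster would do.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }

        // Strategic merge patch against the "status" subresource, shaped like the
        // kubelet's payload above but carrying a single illustrative condition.
        patch := []byte(`{"status":{"conditions":[{"type":"Ready","status":"False","reason":"KubeletNotReady"}]}}`)
        _, err = cs.CoreV1().Nodes().Patch(context.TODO(), "crc",
            types.StrategicMergePatchType, patch, metav1.PatchOptions{}, "status")
        fmt.Println("patch error:", err) // expect the webhook x509 failure while the serving cert is expired
    }
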
event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.233901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.233934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.233954 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.251278 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.251675 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.251706 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.251771 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.251934 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.252280 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.253363 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:56:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:56:56 crc kubenswrapper[4835]: E1002 10:56:56.253469 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.255696 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
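The repeated patch attempts all fail the same way and end in "Unable to update node status ... update node status exceeds retry count": the kubelet retries the status patch a small fixed number of times per sync (nodeStatusUpdateRetry, 5 in current kubelet sources) and then gives up for that cycle, because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, well before the current time of 2025-10-02. A minimal sketch, assuming it is run on the node itself, that pulls the serving certificate from the port named in the log and reports its validity window:

    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // Endpoint taken from the kubelet log. InsecureSkipVerify is used only so the
        // handshake succeeds and we can inspect the (already expired) certificate.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        now := time.Now().UTC()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
                cert.Subject.String(),
                cert.NotBefore.UTC().Format(time.RFC3339),
                cert.NotAfter.UTC().Format(time.RFC3339),
                now.After(cert.NotAfter))
        }
    }

Against this node the check would report expired=true for the leaf certificate; until the webhook's serving certificate is rotated (or a badly skewed clock is corrected), every node status patch can be expected to keep failing the same way.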
event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.255736 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.255750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.255774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.255791 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.268555 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.359304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.359366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.359386 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.359414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.359435 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.462951 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.463026 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.463039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.463056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.463067 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.566816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.566891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.566905 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.566933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.566947 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.670751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.670822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.670833 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.670853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.670864 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.774388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.774450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.774463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.774483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.774501 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.876607 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.876649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.876659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.876677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.876688 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.980046 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.980343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.980410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.980475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:56 crc kubenswrapper[4835]: I1002 10:56:56.980536 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:56Z","lastTransitionTime":"2025-10-02T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.083858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.083939 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.083953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.083978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.083994 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.187685 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.187746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.187756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.187803 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.187816 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.251720 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:57 crc kubenswrapper[4835]: E1002 10:56:57.252495 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.252777 4835 scope.go:117] "RemoveContainer" containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" Oct 02 10:56:57 crc kubenswrapper[4835]: E1002 10:56:57.252984 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.291085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.291119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.291126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.291141 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.291152 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
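In the record just above, ovnkube-controller in pod ovnkube-node-79zgl is being held in CrashLoopBackOff with "back-off 40s restarting failed container". The kubelet's crash back-off doubles on each consecutive failure from a 10s base up to a 5m ceiling (the upstream defaults), so a 40s delay corresponds to roughly the third failed restart in a row. A tiny sketch of that schedule, assuming those defaults:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed kubelet defaults: 10s initial crash back-off, doubled per failure, capped at 5m.
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for crash := 1; crash <= 7; crash++ {
            fmt.Printf("consecutive crash #%d -> next restart delayed %s\n", crash, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

On an OVN-Kubernetes cluster the ovnkube-node pod is what eventually drops the CNI configuration into place, so it is consistent that the "no CNI configuration file in /etc/kubernetes/cni/net.d/" condition in the surrounding records does not clear while this container keeps crashing.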
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.393729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.393808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.393820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.393845 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.393859 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.496878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.496931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.496943 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.496962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.496975 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.599955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.600025 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.600035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.600057 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.600069 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.705333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.705391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.705403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.705419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.705431 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.808367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.808415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.808426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.808445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.808456 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.910753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.910829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.910848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.910876 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:57 crc kubenswrapper[4835]: I1002 10:56:57.910894 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:57Z","lastTransitionTime":"2025-10-02T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.013414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.013468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.013478 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.013496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.013507 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.116122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.116171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.116183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.116203 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.116215 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.218812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.218872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.218884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.218913 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.218932 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.251586 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.251643 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.251655 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:56:58 crc kubenswrapper[4835]: E1002 10:56:58.251756 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:56:58 crc kubenswrapper[4835]: E1002 10:56:58.251860 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:56:58 crc kubenswrapper[4835]: E1002 10:56:58.251990 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.320877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.320918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.320926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.320941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.320950 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.423658 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.423762 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.423782 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.423823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.423859 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.526695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.526807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.526828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.526852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.526869 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.629443 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.629484 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.629493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.629510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.629519 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.735750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.735824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.735839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.735860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.735877 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.838332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.838368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.838376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.838389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.838399 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.941623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.941714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.941729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.941754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:58 crc kubenswrapper[4835]: I1002 10:56:58.941797 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:58Z","lastTransitionTime":"2025-10-02T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.043836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.043910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.043926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.043957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.043979 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.146671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.146715 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.146726 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.146762 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.146775 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.249624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.249691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.249707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.249732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.249752 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.250739 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:56:59 crc kubenswrapper[4835]: E1002 10:56:59.250846 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.352957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.352992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.353001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.353017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.353027 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.455051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.455083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.455094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.455108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.455118 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.557975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.558063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.558085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.558115 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.558133 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.660172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.660202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.660211 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.660246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.660256 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.762280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.762331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.762339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.762358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.762369 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.865027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.865062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.865071 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.865086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.865095 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.967506 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.967554 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.967562 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.967577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:56:59 crc kubenswrapper[4835]: I1002 10:56:59.967587 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:56:59Z","lastTransitionTime":"2025-10-02T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.069535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.069579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.069591 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.069608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.069619 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.171350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.171400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.171413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.171430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.171442 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.251542 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.251593 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:00 crc kubenswrapper[4835]: E1002 10:57:00.251709 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.251762 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:00 crc kubenswrapper[4835]: E1002 10:57:00.251862 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:00 crc kubenswrapper[4835]: E1002 10:57:00.251923 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.273352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.273652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.273665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.273703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.273715 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.376618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.376681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.376692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.376710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.376752 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.479353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.479403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.479414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.479430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.479438 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.582014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.582053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.582062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.582078 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.582088 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.686177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.686281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.686296 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.686315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.686327 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.788902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.788948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.788956 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.788971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.788980 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.891116 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.891155 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.891162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.891178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.891188 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.993602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.993646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.993654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.993670 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:00 crc kubenswrapper[4835]: I1002 10:57:00.993681 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:00Z","lastTransitionTime":"2025-10-02T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.096121 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.096164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.096178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.096197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.096209 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.199436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.199497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.199516 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.199539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.199556 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.251362 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:01 crc kubenswrapper[4835]: E1002 10:57:01.251687 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.302195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.302295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.302322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.302357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.302386 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.404965 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.405008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.405016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.405032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.405043 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.507797 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.507840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.507850 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.507864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.507873 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.609777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.609826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.609840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.609862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.609874 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.712651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.712687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.712695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.712709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.712718 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.815553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.815596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.815608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.815624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.815636 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.919464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.919512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.919522 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.919539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:01 crc kubenswrapper[4835]: I1002 10:57:01.919551 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:01Z","lastTransitionTime":"2025-10-02T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.022442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.022496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.022507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.022523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.022532 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.125838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.125873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.125882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.125899 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.125908 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.228102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.228144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.228153 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.228169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.228181 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.250903 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.250903 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:02 crc kubenswrapper[4835]: E1002 10:57:02.251042 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:02 crc kubenswrapper[4835]: E1002 10:57:02.251112 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.250921 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:02 crc kubenswrapper[4835]: E1002 10:57:02.251207 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.331396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.331463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.331486 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.331514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.331601 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.434859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.434913 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.434926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.434944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.435399 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.537825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.537875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.537887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.537918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.537932 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.640550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.640592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.640603 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.640623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.640634 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.743547 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.743589 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.743599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.743650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.743663 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.846231 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.846273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.846283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.846301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.846314 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.948971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.949035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.949048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.949065 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:02 crc kubenswrapper[4835]: I1002 10:57:02.949075 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:02Z","lastTransitionTime":"2025-10-02T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.051718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.051796 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.051809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.051829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.051843 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.154187 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.154247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.154256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.154270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.154303 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.251306 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:03 crc kubenswrapper[4835]: E1002 10:57:03.251547 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.256824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.256867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.256879 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.256894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.256905 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.359484 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.359517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.359524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.359539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.359548 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.367968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:03 crc kubenswrapper[4835]: E1002 10:57:03.368269 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:57:03 crc kubenswrapper[4835]: E1002 10:57:03.368332 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs podName:7fddaac1-5041-411a-8aed-e7337c06713f nodeName:}" failed. No retries permitted until 2025-10-02 10:58:07.368314855 +0000 UTC m=+163.928222436 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs") pod "network-metrics-daemon-5j5j6" (UID: "7fddaac1-5041-411a-8aed-e7337c06713f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.461826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.461895 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.461910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.461928 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.461939 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.564776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.564814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.564824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.564839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.564850 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.667151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.667190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.667200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.667234 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.667247 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.769997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.770032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.770040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.770054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.770065 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.872878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.872912 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.872921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.872941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.872950 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.974824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.974888 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.974900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.974917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:03 crc kubenswrapper[4835]: I1002 10:57:03.975304 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:03Z","lastTransitionTime":"2025-10-02T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.077121 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.077146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.077156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.077171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.077180 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.179482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.179542 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.179557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.179575 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.179587 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.251602 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:04 crc kubenswrapper[4835]: E1002 10:57:04.251752 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.251773 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.251823 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:04 crc kubenswrapper[4835]: E1002 10:57:04.252058 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:04 crc kubenswrapper[4835]: E1002 10:57:04.252430 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.264924 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e98acda28ae80aa68b33201a40eafa2a59f39e434d959bd502255eb6e122d985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6150d47f7a14a624197a0ba965bc298865882229bc80135f3f5839dc650bcf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.279603 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0a44e17aad1b0c131c032781dfb1e02ce4fa7e4264ce9aa8b6093049b5bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.281359 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.281413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.281425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.281442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.281455 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.294037 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f98828b9-0b27-4632-bfd1-d494cb8dfcfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f372d76f951abafcdde5325fb779586e686cd1a462885cbccbc4bb9e168501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5dd84d3b771ac9a2aa31f7036dd89e3c9115742020eae44757c86ad981204a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shq7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gm7l9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.305841 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2tw4v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea2edfd-8b9c-44be-be9a-d2feb410da71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:34Z\\\",\\\"message\\\":\\\"2025-10-02T10:55:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb\\\\n2025-10-02T10:55:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_30e509b4-174f-4864-b9f3-ca2546b171cb to /host/opt/cni/bin/\\\\n2025-10-02T10:55:49Z [verbose] multus-daemon started\\\\n2025-10-02T10:55:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:56:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btz7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2tw4v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.321648 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e295ff08-63dc-4638-8fb6-6ee6b07ccaa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30bd7236dd88e6614c32508cdae3514b662fa6ad29341e2375e8eb61682ae859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1132707bb58dac1d43fe426fbc47b04f4c5428c6c42e4c2bf88c907fdd54b90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c99dd8f950517ec19ca7ed59f0485bd378f17119cf569198f4d8b938efb6fb8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed1bbfffa1b4fe7bed113f9f3fbb7127efc33dd73f845e309dda0a424a6cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59ddf2f44fe83b753ff45a6ab7957a79c6e7c2d7adfe6bd31a9f579c096e04cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc90cacb054e94a03dfff88596b09d453d019efe241cc789ae746e0725918ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2df66234f30983ff80df6f65108c03afce1941bebe357a3e2316a34713203a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6cwt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bjtqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.344025 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:56:44Z\\\",\\\"message\\\":\\\":{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-oauth-apiserver/api]} name:Service_openshift-oauth-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.140:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {fe46cb89-4e54-4175-a112-1c5224cd299e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:56:44.209174 6862 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 10:56:44.209292 6862 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:56:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsbdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79zgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.357070 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fddaac1-5041-411a-8aed-e7337c06713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2s4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5j5j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.369382 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"215087b8-4281-4808-b6b7-713a1b52987a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb93a4708a77d98acadf687c1881147804d82e0cc4b7bbcbba31920c6fde6777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92a657e8b796c5e1b60c5c9632636ef333a57ca66b3b5e4b86aafa893c43fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c996c6075e8d224b85ed7f54bb6d1dc1509a1fab2f1fe450379ba6a810603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://682f28c527ec3c9f912be69d794f7fecaa8232b6135b0bf6491b7d16fe8f9516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.381111 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bedc338-88b1-4c6e-9855-5f3a7298c611\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581ca8c1ec33b426223ce2b125e5e8995a3258a92c1ceaca6a2a1333bb35a164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962f1b88475c1769f073e54459d255dfacd841366c065f3263bb203e1bbbca47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://962f1b88475c1769f073e54459d255dfacd841366c065f3263bb203e1bbbca47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.383841 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.383874 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.383916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.383937 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.383949 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.396728 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e36d6de-00d5-475b-a7fc-d6a88252b5e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6b4031aab5d93557c38ae808d2e87501e280f2d94ba1ea50d6ee62aa4884dd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5738d1125bbd4bf0f223fbd76df27777ca5a9f57bd3d952384802bf05b1fb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c436f35d132f2667c76e301bfc53f97cc7cbc66d75491e74b197ba88125b981\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0862694ec37eab3710530f04e9a62299082e3c87cfc0a601fc58d4b37f34e49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.410689 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1965c20bb686673dc9a25126162f01be2d6d35f0db45feb7f89fa6820707b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.424110 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.435042 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nzxcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b2bc9d-d549-47f8-a503-27f19e3b0889\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32efdb66d507064c708d94c9a23ad723025c0c52ce4b24c951d647c8b3618466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2lb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nzxcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.447100 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce0ad186-63b7-432a-a0ca-4d4cbde057a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8647b52388ff203c7fc4580c6601b31e65024071a739051c86ea8d6489d07463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpt25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5ckb9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.466559 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0dc7d77a-95e5-4f89-9e76-2eb57d4a0425\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09de897d1ce9009170b27f1e9d924acc713deeb1c11be5b19c573ba4df2d255f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://925933106ef051b2fe7dbd61758717f24d3f8370778fcfec65373884e7b09862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ed698d93bdcaf46bed5745e06f66ebef02a88d50802e5f7a304b4dba7b31a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0938eca70f2e31639a8168c94539e7a2ef5bacbc77795d117ce0392d9a4b52e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f781dffd3ee8eb8211c6663574642acd7de8cfe14b99e1c63d65d4c7c19f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02d4e31ef3f58036a2fe6c09f70020f137b6b9bbabcd3d9e6ecb39e7b38bc380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d4e31ef3f58036a2fe6c09f70020f137b6b9bbabcd3d9e6ecb39e7b38bc380\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6920ca4b62b0e37fe1fa3b7ed75f509a4dbecc41e23ea52c1366a3b3f82f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba6920ca4b62b0e37fe1fa3b7ed75f509a4dbecc41e23ea52c1366a3b3f82f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://54b5275fb5b1530b809ab177485fd261602e5008be9c28381f72fac7ef0fdc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b5275fb5b1530b809ab177485fd261602e5008be9c28381f72fac7ef0fdc85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.482397 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939674dc-dc13-4e7c-bae7-d91b3504ec8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4686d75b385e9c157d9f34f0dde7d367abb209bf0a9b5cb161ab8f0eb4c9b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8cd937fc48276e1bd728ea7388d183dc15915c19f3a55218c274c1cf9925b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a697f58456bae3cc337c099717057929ba4488f750ca4e96bffa2988f6b7c8a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832dc70f20bdb33660db5019a3073a0c357d9443c35fb33be437caae5bbded60\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa277d1ea92fae90b6d11f0ca7d4f37da715da78f6d4b0825f164fdb2f002eef\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:55:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1002 10:55:44.624145 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:55:44.624412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:55:44.626322 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2774798155/tls.crt::/tmp/serving-cert-2774798155/tls.key\\\\\\\"\\\\nI1002 10:55:45.036578 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:55:45.039594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:55:45.039615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:55:45.039639 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:55:45.039645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:55:45.044591 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:55:45.044633 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044639 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:55:45.044646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:55:45.044650 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:55:45.044654 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:55:45.044658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:55:45.044783 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:55:45.046671 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d70550315d7e4a83f8b026d27ed278b89bbac590ee1cf8ce436dbc7c1b7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73837191123f9d9dd3ff4a2d6ffcce9194b2b7798b8648c7f1f8c98170afd492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:55:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.487130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.487174 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.487183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.487198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.487208 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.496930 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.509311 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.519578 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpzpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"148f7288-2984-4fd9-8d43-9ee90fb4adaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d74e77360c5a0cdf01b61615b20bf67e1ba7f4314006bfa1f372b87fd59a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5m4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:55:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpzpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.589376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.589421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.589433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.589449 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.589459 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.692719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.692778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.692798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.692820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.692840 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.795027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.795071 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.795085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.795102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.795112 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.897104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.897143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.897155 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.897172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.897184 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.999446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.999483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.999491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.999505 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:04 crc kubenswrapper[4835]: I1002 10:57:04.999514 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:04Z","lastTransitionTime":"2025-10-02T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.102138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.102194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.102281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.102309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.102324 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.205061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.205099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.205111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.205127 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.205139 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.250757 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:05 crc kubenswrapper[4835]: E1002 10:57:05.250896 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.308706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.308765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.308775 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.308798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.308810 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.411850 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.411911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.411927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.411950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.411966 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.514894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.514971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.514981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.514996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.515009 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.618073 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.618151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.618161 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.618184 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.618195 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.721245 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.721295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.721306 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.721324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.721336 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.825019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.825103 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.825122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.825142 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.825155 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.927347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.927456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.927485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.927523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:05 crc kubenswrapper[4835]: I1002 10:57:05.927549 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:05Z","lastTransitionTime":"2025-10-02T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.031564 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.031615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.031629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.031649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.031662 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.135662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.135721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.135730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.135751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.135766 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.239000 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.239053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.239064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.239083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.239097 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.252043 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.252067 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.252348 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.252470 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.252547 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.252818 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.278124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.278215 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.278281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.278315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.278333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.294423 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.299532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.299611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.299625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.299647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.299660 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.314300 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.319083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.319134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.319151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.319177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.319195 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.332974 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.337543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.337580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.337594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.337613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.337626 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.349996 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.354729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.354789 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.354802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.354824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.354838 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.367058 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9e30b685-d777-4f6c-84ea-b9aec6204c89\\\",\\\"systemUUID\\\":\\\"0e54c7ff-993a-4ab6-9817-7e5a943ad8d7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:57:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:57:06 crc kubenswrapper[4835]: E1002 10:57:06.367199 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.369239 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
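The repeated "Error updating node status, will retry" entries above, and the final "update node status exceeds retry count", all fail for the same reason: the API server rejects the kubelet's status patch because its call to the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 fails TLS verification; the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-02. A minimal sketch for confirming the expiry from the node itself follows; it assumes Python 3 with the third-party cryptography package, and the address 127.0.0.1:9743 is taken directly from the log line above.

    # Hedged sketch: fetch the webhook's serving certificate and print its validity window.
    # Assumes Python 3 plus the third-party "cryptography" package; 127.0.0.1:9743 is the
    # endpoint named in the failed webhook call logged above.
    import socket
    import ssl
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False       # inspection only: the certificate is expected
    ctx.verify_mode = ssl.CERT_NONE  # to be expired, so skip verification

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("subject:   ", cert.subject.rfc4514_string())
    print("not before:", cert.not_valid_before)
    print("not after: ", cert.not_valid_after)  # the log reports 2025-08-24T17:21:41Z

If the printed not-after date is indeed in the past, rotating the webhook's serving certificate (or letting the cluster regenerate it once certificate rotation is healthy again) is what allows these node-status patches to go through.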
event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.369273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.369284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.369301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.369310 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.471630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.471677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.471731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.471752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.471761 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.574514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.574583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.574597 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.574613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.574626 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.677014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.677047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.677055 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.677069 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.677077 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.779205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.779297 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.779311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.779349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.779363 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.881363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.881419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.881460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.881479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.881493 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.983479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.983524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.983535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.983552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:06 crc kubenswrapper[4835]: I1002 10:57:06.983563 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:06Z","lastTransitionTime":"2025-10-02T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.085690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.085719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.085727 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.085740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.085749 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.188834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.188875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.188884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.188898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.188907 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.251811 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:07 crc kubenswrapper[4835]: E1002 10:57:07.252162 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.291823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.291885 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.291896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.291919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.291933 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.394767 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.394817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.394828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.394844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.394856 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.497697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.497753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.497766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.497785 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.497801 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.600421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.600464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.600475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.600494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.600507 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.702394 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.702450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.702460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.702477 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.702487 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.804778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.804826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.804838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.804857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.804869 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.907625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.907685 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.907695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.907711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:07 crc kubenswrapper[4835]: I1002 10:57:07.907720 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:07Z","lastTransitionTime":"2025-10-02T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.010440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.010529 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.010571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.010587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.010599 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.113389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.113435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.113447 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.113465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.113479 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.216023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.216069 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.216081 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.216098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.216110 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.250771 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.250821 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.250771 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:08 crc kubenswrapper[4835]: E1002 10:57:08.250929 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
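Interleaved with the status-patch failures, the NodeNotReady condition in this stretch of the log always carries the same message: the container runtime reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, so pods that need a new sandbox (network-metrics-daemon-5j5j6, network-check-source, network-check-target, networking-console-plugin) are skipped with "network is not ready". The short sketch below reproduces that check directly on the node; the directory path is quoted from the log, while the .conf/.conflist/.json suffixes are the conventional CNI config extensions, assumed here for illustration.

    # Hedged sketch: check whether any CNI network config is present in the
    # directory the kubelet complains about. Path taken from the log above;
    # the accepted suffixes are the usual CNI config extensions.
    import os

    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"
    CNI_SUFFIXES = (".conf", ".conflist", ".json")

    try:
        entries = sorted(os.listdir(CNI_CONF_DIR))
    except FileNotFoundError:
        entries = []

    configs = [name for name in entries if name.endswith(CNI_SUFFIXES)]

    if configs:
        print("CNI config present:", ", ".join(configs))
    else:
        print(f"no CNI configuration file in {CNI_CONF_DIR} -- "
              "the network plugin has not written its configuration yet")

On this node the directory stays empty because the network plugin itself has not come up, which is consistent with the sandbox-creation errors recorded around this point.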
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:08 crc kubenswrapper[4835]: E1002 10:57:08.250998 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:08 crc kubenswrapper[4835]: E1002 10:57:08.251075 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.318793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.318836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.318848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.318865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.318876 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.421241 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.421282 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.421293 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.421308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.421321 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.523291 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.523337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.523348 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.523365 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.523379 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.626809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.626898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.626931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.626959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.626980 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.729031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.729075 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.729087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.729107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.729123 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.831496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.831551 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.831565 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.831588 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.831603 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.933800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.933842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.933854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.933873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:08 crc kubenswrapper[4835]: I1002 10:57:08.933884 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:08Z","lastTransitionTime":"2025-10-02T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.036741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.036791 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.036802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.036821 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.036835 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.140047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.140111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.140136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.140179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.140201 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.243117 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.243201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.243256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.243289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.243314 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.251653 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:09 crc kubenswrapper[4835]: E1002 10:57:09.252017 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.252238 4835 scope.go:117] "RemoveContainer" containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" Oct 02 10:57:09 crc kubenswrapper[4835]: E1002 10:57:09.252389 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.347414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.347493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.347520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.347557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.347582 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.450633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.450680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.450693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.450711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.450721 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.554176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.554246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.554256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.554273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.554282 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.657140 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.657212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.657255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.657351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.657387 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.760618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.760703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.760716 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.760732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.760743 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.863203 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.863258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.863270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.863285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.863294 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.966443 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.966506 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.966520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.966542 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:09 crc kubenswrapper[4835]: I1002 10:57:09.966557 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:09Z","lastTransitionTime":"2025-10-02T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.069202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.069280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.069324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.069343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.069357 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.172035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.172086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.172098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.172115 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.172132 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.251083 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.251154 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.251103 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:10 crc kubenswrapper[4835]: E1002 10:57:10.251264 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:10 crc kubenswrapper[4835]: E1002 10:57:10.251376 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:10 crc kubenswrapper[4835]: E1002 10:57:10.251480 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.274478 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.274535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.274547 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.274573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.274589 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.377261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.377311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.377321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.377338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.377349 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.480886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.480938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.480959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.480986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.481006 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.585200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.585288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.585302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.585328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.585344 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.687196 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.687269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.687283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.687301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.687313 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.789967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.790010 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.790022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.790042 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.790059 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.893431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.893504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.893536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.893587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.893615 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.996729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.996776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.996842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.996864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:10 crc kubenswrapper[4835]: I1002 10:57:10.996876 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:10Z","lastTransitionTime":"2025-10-02T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.099796 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.099838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.099849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.099867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.099885 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.203337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.203385 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.203400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.203422 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.203434 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.254258 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:11 crc kubenswrapper[4835]: E1002 10:57:11.254489 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.306774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.306824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.306836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.306853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.306865 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.410331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.410414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.410439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.410474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.410504 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.513720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.513835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.513876 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.513914 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.513937 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.618586 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.618635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.618647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.618672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.618684 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.722254 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.722313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.722322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.722346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.722359 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.825290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.825356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.825371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.825405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.825421 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.929633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.929682 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.929701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.929722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:11 crc kubenswrapper[4835]: I1002 10:57:11.929736 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:11Z","lastTransitionTime":"2025-10-02T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.032896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.032955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.032973 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.032996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.033011 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.136090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.136148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.136164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.136187 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.136202 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.238752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.238786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.238793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.238808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.238816 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.251078 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:12 crc kubenswrapper[4835]: E1002 10:57:12.251194 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.251078 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.251303 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:12 crc kubenswrapper[4835]: E1002 10:57:12.251411 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:12 crc kubenswrapper[4835]: E1002 10:57:12.251619 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.340926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.340961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.340970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.340985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.340995 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.443651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.443719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.443731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.443754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.443769 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.546571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.546644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.546668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.546704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.546726 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.649512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.649559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.649571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.649588 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.649600 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.751990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.752048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.752073 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.752105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.752129 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.854883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.854934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.854945 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.854965 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.854978 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.958802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.959804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.959873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.959916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:12 crc kubenswrapper[4835]: I1002 10:57:12.959937 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:12Z","lastTransitionTime":"2025-10-02T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.062247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.062316 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.062326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.062340 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.062350 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.164574 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.164625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.164641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.164661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.164673 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.251764 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:13 crc kubenswrapper[4835]: E1002 10:57:13.251885 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.266746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.266798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.266809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.266828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.266840 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.369587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.369630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.369641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.369668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.369680 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.471674 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.471713 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.471722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.471738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.471747 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.574494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.574540 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.574550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.574568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.574582 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.677006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.677045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.677056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.677076 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.677088 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.779811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.779850 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.779861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.779877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.779886 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.881687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.882051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.882077 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.882100 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.882112 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.985546 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.985597 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.985606 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.985623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:13 crc kubenswrapper[4835]: I1002 10:57:13.985633 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:13Z","lastTransitionTime":"2025-10-02T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.089367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.089417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.089426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.089443 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.089452 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.196819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.197212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.197292 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.197323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.197343 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.251005 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:14 crc kubenswrapper[4835]: E1002 10:57:14.251183 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.251314 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:14 crc kubenswrapper[4835]: E1002 10:57:14.251767 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.252103 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:14 crc kubenswrapper[4835]: E1002 10:57:14.252422 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.301450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.301499 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.301512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.301533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.301546 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.305802 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podStartSLOduration=89.305789059 podStartE2EDuration="1m29.305789059s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.282757219 +0000 UTC m=+110.842664810" watchObservedRunningTime="2025-10-02 10:57:14.305789059 +0000 UTC m=+110.865696660" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.306028 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=55.306020896 podStartE2EDuration="55.306020896s" podCreationTimestamp="2025-10-02 10:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.305352926 +0000 UTC m=+110.865260517" watchObservedRunningTime="2025-10-02 10:57:14.306020896 +0000 UTC m=+110.865928487" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.319466 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.319450908 podStartE2EDuration="18.319450908s" podCreationTimestamp="2025-10-02 10:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.318917832 +0000 UTC m=+110.878825433" watchObservedRunningTime="2025-10-02 10:57:14.319450908 +0000 UTC m=+110.879358509" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.332252 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=86.332218591 podStartE2EDuration="1m26.332218591s" podCreationTimestamp="2025-10-02 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.331648134 +0000 UTC m=+110.891555735" watchObservedRunningTime="2025-10-02 10:57:14.332218591 +0000 UTC m=+110.892126192" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.372240 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nzxcq" podStartSLOduration=90.372195238 podStartE2EDuration="1m30.372195238s" podCreationTimestamp="2025-10-02 10:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.371617631 +0000 UTC m=+110.931525222" watchObservedRunningTime="2025-10-02 10:57:14.372195238 +0000 UTC m=+110.932102819" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.404142 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.404176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.404185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.404200 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.404211 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.410508 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=27.410489065 podStartE2EDuration="27.410489065s" podCreationTimestamp="2025-10-02 10:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.410254428 +0000 UTC m=+110.970162049" watchObservedRunningTime="2025-10-02 10:57:14.410489065 +0000 UTC m=+110.970396646" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.435600 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.435582617 podStartE2EDuration="1m29.435582617s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.434849405 +0000 UTC m=+110.994757006" watchObservedRunningTime="2025-10-02 10:57:14.435582617 +0000 UTC m=+110.995490198" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.489103 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bpzpk" podStartSLOduration=89.48908136 podStartE2EDuration="1m29.48908136s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.471425421 +0000 UTC m=+111.031333002" watchObservedRunningTime="2025-10-02 10:57:14.48908136 +0000 UTC m=+111.048988941" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.506369 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.506411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.506423 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.506441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.506452 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.514733 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gm7l9" podStartSLOduration=89.514713908 podStartE2EDuration="1m29.514713908s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.514243514 +0000 UTC m=+111.074151095" watchObservedRunningTime="2025-10-02 10:57:14.514713908 +0000 UTC m=+111.074621509" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.529492 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2tw4v" podStartSLOduration=89.52947419 podStartE2EDuration="1m29.52947419s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.528727718 +0000 UTC m=+111.088635299" watchObservedRunningTime="2025-10-02 10:57:14.52947419 +0000 UTC m=+111.089381771" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.567340 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bjtqm" podStartSLOduration=89.567308744 podStartE2EDuration="1m29.567308744s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:14.543337746 +0000 UTC m=+111.103245337" watchObservedRunningTime="2025-10-02 10:57:14.567308744 +0000 UTC m=+111.127216315" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.608700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.608739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.608751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.608767 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.608781 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.710608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.710659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.710671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.710690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.710701 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.813315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.813358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.813370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.813388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.813402 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.914944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.914992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.915004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.915021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:14 crc kubenswrapper[4835]: I1002 10:57:14.915035 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:14Z","lastTransitionTime":"2025-10-02T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.017307 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.017355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.017367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.017384 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.017393 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.120842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.120898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.120911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.120933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.120945 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.224141 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.224204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.224251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.224282 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.224301 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.251348 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:15 crc kubenswrapper[4835]: E1002 10:57:15.251513 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.327397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.327462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.327482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.327507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.327525 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.431557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.431627 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.431649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.431678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.431700 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.535444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.535487 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.535498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.535517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.535530 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.640070 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.640154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.640173 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.640204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.640284 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.743180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.743213 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.743233 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.743247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.743256 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.845826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.845891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.845911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.845938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.845961 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.947676 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.947720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.947731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.947748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:15 crc kubenswrapper[4835]: I1002 10:57:15.947758 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:15Z","lastTransitionTime":"2025-10-02T10:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.050178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.050240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.050252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.050269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.050278 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:16Z","lastTransitionTime":"2025-10-02T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.152629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.152673 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.152686 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.152704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.152717 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:16Z","lastTransitionTime":"2025-10-02T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.251685 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.251742 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.251901 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:16 crc kubenswrapper[4835]: E1002 10:57:16.252038 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:16 crc kubenswrapper[4835]: E1002 10:57:16.252191 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:16 crc kubenswrapper[4835]: E1002 10:57:16.252556 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.254998 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.255039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.255051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.255069 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.255082 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:16Z","lastTransitionTime":"2025-10-02T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.358143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.358208 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.358254 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.358281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.358299 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:16Z","lastTransitionTime":"2025-10-02T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.424620 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.424677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.424701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.424730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.424750 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:57:16Z","lastTransitionTime":"2025-10-02T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.493338 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5"] Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.494099 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.496400 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.497058 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.497373 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.498780 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.614393 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e975dea-93bf-4a8d-b300-6903eee09ef8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.614471 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e975dea-93bf-4a8d-b300-6903eee09ef8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.614502 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e975dea-93bf-4a8d-b300-6903eee09ef8-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.614623 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e975dea-93bf-4a8d-b300-6903eee09ef8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.614675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e975dea-93bf-4a8d-b300-6903eee09ef8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.716110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e975dea-93bf-4a8d-b300-6903eee09ef8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.716198 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e975dea-93bf-4a8d-b300-6903eee09ef8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.716259 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e975dea-93bf-4a8d-b300-6903eee09ef8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.716280 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e975dea-93bf-4a8d-b300-6903eee09ef8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.716307 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e975dea-93bf-4a8d-b300-6903eee09ef8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.716409 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e975dea-93bf-4a8d-b300-6903eee09ef8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: 
\"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.716441 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e975dea-93bf-4a8d-b300-6903eee09ef8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.718901 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e975dea-93bf-4a8d-b300-6903eee09ef8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.730654 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e975dea-93bf-4a8d-b300-6903eee09ef8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.737796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e975dea-93bf-4a8d-b300-6903eee09ef8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lklb5\" (UID: \"0e975dea-93bf-4a8d-b300-6903eee09ef8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.864267 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" Oct 02 10:57:16 crc kubenswrapper[4835]: I1002 10:57:16.942935 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" event={"ID":"0e975dea-93bf-4a8d-b300-6903eee09ef8","Type":"ContainerStarted","Data":"a7f39742bd39a0f11d1c2786e439effd024c998827f3f841e33f357a83980091"} Oct 02 10:57:17 crc kubenswrapper[4835]: I1002 10:57:17.251181 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:17 crc kubenswrapper[4835]: E1002 10:57:17.251882 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:17 crc kubenswrapper[4835]: I1002 10:57:17.947244 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" event={"ID":"0e975dea-93bf-4a8d-b300-6903eee09ef8","Type":"ContainerStarted","Data":"49f62a00ccdad1bc65fe1d80224c9eee666fb5c2c29446a89ba9fbe662a34406"} Oct 02 10:57:17 crc kubenswrapper[4835]: I1002 10:57:17.963431 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lklb5" podStartSLOduration=92.963413297 podStartE2EDuration="1m32.963413297s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:17.96217739 +0000 UTC m=+114.522084971" watchObservedRunningTime="2025-10-02 10:57:17.963413297 +0000 UTC m=+114.523320878" Oct 02 10:57:18 crc kubenswrapper[4835]: I1002 10:57:18.251943 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:18 crc kubenswrapper[4835]: I1002 10:57:18.251992 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:18 crc kubenswrapper[4835]: I1002 10:57:18.252054 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:18 crc kubenswrapper[4835]: E1002 10:57:18.252154 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:18 crc kubenswrapper[4835]: E1002 10:57:18.252284 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:18 crc kubenswrapper[4835]: E1002 10:57:18.252382 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:19 crc kubenswrapper[4835]: I1002 10:57:19.251280 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:19 crc kubenswrapper[4835]: E1002 10:57:19.251502 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.251609 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.251719 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.251752 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:20 crc kubenswrapper[4835]: E1002 10:57:20.251920 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:20 crc kubenswrapper[4835]: E1002 10:57:20.252287 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:20 crc kubenswrapper[4835]: E1002 10:57:20.252116 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.959365 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/1.log" Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.960090 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/0.log" Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.960187 4835 generic.go:334] "Generic (PLEG): container finished" podID="cea2edfd-8b9c-44be-be9a-d2feb410da71" containerID="aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561" exitCode=1 Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.960285 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerDied","Data":"aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561"} Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.960357 4835 scope.go:117] "RemoveContainer" containerID="48a31bf255dec73ed2199f10e94050b901e2c5787935e2cbe7ae3bbfa1b85c2b" Oct 02 10:57:20 crc kubenswrapper[4835]: I1002 10:57:20.960826 4835 scope.go:117] "RemoveContainer" containerID="aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561" Oct 02 10:57:20 crc kubenswrapper[4835]: E1002 10:57:20.961205 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-2tw4v_openshift-multus(cea2edfd-8b9c-44be-be9a-d2feb410da71)\"" pod="openshift-multus/multus-2tw4v" podUID="cea2edfd-8b9c-44be-be9a-d2feb410da71" Oct 02 10:57:21 crc kubenswrapper[4835]: I1002 10:57:21.251394 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:21 crc kubenswrapper[4835]: E1002 10:57:21.251636 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:21 crc kubenswrapper[4835]: I1002 10:57:21.973856 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/1.log" Oct 02 10:57:22 crc kubenswrapper[4835]: I1002 10:57:22.250883 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:22 crc kubenswrapper[4835]: I1002 10:57:22.250939 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:22 crc kubenswrapper[4835]: E1002 10:57:22.252667 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:22 crc kubenswrapper[4835]: I1002 10:57:22.251013 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:22 crc kubenswrapper[4835]: E1002 10:57:22.252758 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:22 crc kubenswrapper[4835]: E1002 10:57:22.252967 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:23 crc kubenswrapper[4835]: I1002 10:57:23.251702 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:23 crc kubenswrapper[4835]: E1002 10:57:23.251856 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:23 crc kubenswrapper[4835]: I1002 10:57:23.252603 4835 scope.go:117] "RemoveContainer" containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" Oct 02 10:57:23 crc kubenswrapper[4835]: E1002 10:57:23.252777 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79zgl_openshift-ovn-kubernetes(e1c2dc14-32fa-43fc-ae87-11d02eb3400a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" Oct 02 10:57:24 crc kubenswrapper[4835]: E1002 10:57:24.218126 4835 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 02 10:57:24 crc kubenswrapper[4835]: I1002 10:57:24.251663 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:24 crc kubenswrapper[4835]: I1002 10:57:24.251695 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:24 crc kubenswrapper[4835]: I1002 10:57:24.251668 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:24 crc kubenswrapper[4835]: E1002 10:57:24.251813 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:24 crc kubenswrapper[4835]: E1002 10:57:24.251871 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:24 crc kubenswrapper[4835]: E1002 10:57:24.251923 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:24 crc kubenswrapper[4835]: E1002 10:57:24.384064 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 10:57:25 crc kubenswrapper[4835]: I1002 10:57:25.250937 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:25 crc kubenswrapper[4835]: E1002 10:57:25.251116 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:26 crc kubenswrapper[4835]: I1002 10:57:26.251976 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:26 crc kubenswrapper[4835]: E1002 10:57:26.252320 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:26 crc kubenswrapper[4835]: I1002 10:57:26.252751 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:26 crc kubenswrapper[4835]: I1002 10:57:26.252948 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:26 crc kubenswrapper[4835]: E1002 10:57:26.252904 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:26 crc kubenswrapper[4835]: E1002 10:57:26.253295 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:27 crc kubenswrapper[4835]: I1002 10:57:27.251066 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:27 crc kubenswrapper[4835]: E1002 10:57:27.251278 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:28 crc kubenswrapper[4835]: I1002 10:57:28.251677 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:28 crc kubenswrapper[4835]: I1002 10:57:28.251752 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:28 crc kubenswrapper[4835]: E1002 10:57:28.251864 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:28 crc kubenswrapper[4835]: I1002 10:57:28.251950 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:28 crc kubenswrapper[4835]: E1002 10:57:28.252172 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:28 crc kubenswrapper[4835]: E1002 10:57:28.252258 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:29 crc kubenswrapper[4835]: I1002 10:57:29.251504 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:29 crc kubenswrapper[4835]: E1002 10:57:29.251643 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:29 crc kubenswrapper[4835]: E1002 10:57:29.384756 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 10:57:30 crc kubenswrapper[4835]: I1002 10:57:30.251587 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:30 crc kubenswrapper[4835]: I1002 10:57:30.251605 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:30 crc kubenswrapper[4835]: I1002 10:57:30.251598 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:30 crc kubenswrapper[4835]: E1002 10:57:30.251841 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:30 crc kubenswrapper[4835]: E1002 10:57:30.251937 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:30 crc kubenswrapper[4835]: E1002 10:57:30.252139 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:31 crc kubenswrapper[4835]: I1002 10:57:31.251173 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:31 crc kubenswrapper[4835]: E1002 10:57:31.251378 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:32 crc kubenswrapper[4835]: I1002 10:57:32.251505 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:32 crc kubenswrapper[4835]: E1002 10:57:32.251679 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:32 crc kubenswrapper[4835]: I1002 10:57:32.251916 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:32 crc kubenswrapper[4835]: I1002 10:57:32.251926 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:32 crc kubenswrapper[4835]: E1002 10:57:32.252031 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:32 crc kubenswrapper[4835]: E1002 10:57:32.252336 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:33 crc kubenswrapper[4835]: I1002 10:57:33.250947 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:33 crc kubenswrapper[4835]: E1002 10:57:33.251194 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:34 crc kubenswrapper[4835]: I1002 10:57:34.251063 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:34 crc kubenswrapper[4835]: I1002 10:57:34.251196 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:34 crc kubenswrapper[4835]: E1002 10:57:34.253423 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:34 crc kubenswrapper[4835]: I1002 10:57:34.253448 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:34 crc kubenswrapper[4835]: E1002 10:57:34.253765 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:34 crc kubenswrapper[4835]: E1002 10:57:34.254007 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:34 crc kubenswrapper[4835]: E1002 10:57:34.385337 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 10:57:35 crc kubenswrapper[4835]: I1002 10:57:35.251304 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:35 crc kubenswrapper[4835]: E1002 10:57:35.251547 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:36 crc kubenswrapper[4835]: I1002 10:57:36.252538 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:36 crc kubenswrapper[4835]: I1002 10:57:36.252558 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:36 crc kubenswrapper[4835]: E1002 10:57:36.253062 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:36 crc kubenswrapper[4835]: I1002 10:57:36.252701 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:36 crc kubenswrapper[4835]: E1002 10:57:36.253301 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:36 crc kubenswrapper[4835]: I1002 10:57:36.253509 4835 scope.go:117] "RemoveContainer" containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" Oct 02 10:57:36 crc kubenswrapper[4835]: E1002 10:57:36.253530 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:36 crc kubenswrapper[4835]: I1002 10:57:36.253661 4835 scope.go:117] "RemoveContainer" containerID="aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561" Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.031755 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/1.log" Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.031841 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerStarted","Data":"0a6af3c25fd2b9444b9cb65bb4553e343f1f3d4362cac711a5c1a2252386b09b"} Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.035203 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/3.log" Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.039024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerStarted","Data":"d9744ccc8b6a5a64b87243892ec48b92624b70e6137acbb3b4a1839480650bcb"} Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.039501 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.112688 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podStartSLOduration=112.112652384 podStartE2EDuration="1m52.112652384s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:37.104760908 +0000 UTC m=+133.664668499" watchObservedRunningTime="2025-10-02 10:57:37.112652384 +0000 UTC m=+133.672559985" Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.114439 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5j5j6"] Oct 02 10:57:37 crc kubenswrapper[4835]: I1002 10:57:37.114635 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:37 crc kubenswrapper[4835]: E1002 10:57:37.114796 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:38 crc kubenswrapper[4835]: I1002 10:57:38.250995 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:38 crc kubenswrapper[4835]: E1002 10:57:38.251647 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:38 crc kubenswrapper[4835]: I1002 10:57:38.251005 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:38 crc kubenswrapper[4835]: I1002 10:57:38.251026 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:38 crc kubenswrapper[4835]: E1002 10:57:38.252008 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:38 crc kubenswrapper[4835]: E1002 10:57:38.252319 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:39 crc kubenswrapper[4835]: I1002 10:57:39.251528 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:39 crc kubenswrapper[4835]: E1002 10:57:39.251805 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:39 crc kubenswrapper[4835]: E1002 10:57:39.386427 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 10:57:40 crc kubenswrapper[4835]: I1002 10:57:40.251422 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:40 crc kubenswrapper[4835]: E1002 10:57:40.251616 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:40 crc kubenswrapper[4835]: I1002 10:57:40.251923 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:40 crc kubenswrapper[4835]: E1002 10:57:40.252015 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:40 crc kubenswrapper[4835]: I1002 10:57:40.252334 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:40 crc kubenswrapper[4835]: E1002 10:57:40.252444 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:41 crc kubenswrapper[4835]: I1002 10:57:41.250804 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:41 crc kubenswrapper[4835]: E1002 10:57:41.251004 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:42 crc kubenswrapper[4835]: I1002 10:57:42.251154 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:42 crc kubenswrapper[4835]: I1002 10:57:42.251198 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:42 crc kubenswrapper[4835]: I1002 10:57:42.251165 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:42 crc kubenswrapper[4835]: E1002 10:57:42.251378 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:42 crc kubenswrapper[4835]: E1002 10:57:42.251758 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:42 crc kubenswrapper[4835]: E1002 10:57:42.251844 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:43 crc kubenswrapper[4835]: I1002 10:57:43.251033 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:43 crc kubenswrapper[4835]: E1002 10:57:43.251288 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j5j6" podUID="7fddaac1-5041-411a-8aed-e7337c06713f" Oct 02 10:57:44 crc kubenswrapper[4835]: I1002 10:57:44.251633 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:44 crc kubenswrapper[4835]: I1002 10:57:44.251735 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:44 crc kubenswrapper[4835]: I1002 10:57:44.251731 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:44 crc kubenswrapper[4835]: E1002 10:57:44.254506 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:57:44 crc kubenswrapper[4835]: E1002 10:57:44.254592 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:57:44 crc kubenswrapper[4835]: E1002 10:57:44.254770 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:57:45 crc kubenswrapper[4835]: I1002 10:57:45.250962 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:57:45 crc kubenswrapper[4835]: I1002 10:57:45.254855 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 02 10:57:45 crc kubenswrapper[4835]: I1002 10:57:45.256270 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 02 10:57:46 crc kubenswrapper[4835]: I1002 10:57:46.250867 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:46 crc kubenswrapper[4835]: I1002 10:57:46.250916 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:46 crc kubenswrapper[4835]: I1002 10:57:46.251542 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:46 crc kubenswrapper[4835]: I1002 10:57:46.254650 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 02 10:57:46 crc kubenswrapper[4835]: I1002 10:57:46.261713 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 10:57:46 crc kubenswrapper[4835]: I1002 10:57:46.261732 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 02 10:57:46 crc kubenswrapper[4835]: I1002 10:57:46.262362 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.246878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.300946 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gckdg"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.301572 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.302826 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.303826 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.304700 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rzxmz"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.305301 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.306503 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zl6kd"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.307129 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.308374 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.308869 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.312482 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lz6t7"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.313442 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.317796 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zshpt"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.318305 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.318603 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.319076 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.322167 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm4nm"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.322840 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.323966 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.324975 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.360753 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.364967 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.365216 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.365264 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.369944 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.371366 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.377924 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.385588 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.386243 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.386301 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.386433 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.386708 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.386902 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.386938 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.387115 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.387186 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.387426 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.387427 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.387700 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 02 10:57:47 crc 
kubenswrapper[4835]: I1002 10:57:47.387797 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.388063 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.406415 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.408828 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.410600 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.410986 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.411403 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.411858 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.412435 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.416072 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.413960 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.414186 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.414304 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.414817 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.415128 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.415262 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.415298 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.415437 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.415715 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 02 10:57:47 crc kubenswrapper[4835]: 
I1002 10:57:47.415969 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.415970 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.416409 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.416467 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.416758 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.444474 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.444655 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.444748 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.444841 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.444918 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445006 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445106 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445208 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445386 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445432 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445473 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445716 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.445834 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446026 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446159 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khf59\" (UniqueName: \"kubernetes.io/projected/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-kube-api-access-khf59\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446197 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-config\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446240 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-etcd-client\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446266 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b5rc\" (UniqueName: \"kubernetes.io/projected/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-kube-api-access-8b5rc\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446292 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd86268-0ac9-4583-8285-dc07ea50cc28-serving-cert\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446313 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446338 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd86268-0ac9-4583-8285-dc07ea50cc28-config\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446367 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kglgp\" (UniqueName: \"kubernetes.io/projected/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-kube-api-access-kglgp\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 
10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446390 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-oauth-config\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446414 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97mbg\" (UniqueName: \"kubernetes.io/projected/26659604-60d4-488b-a38b-52fedf8d098d-kube-api-access-97mbg\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446553 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446586 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-config\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446750 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3318f2b-f771-4c19-ac42-46af048219f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446834 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-oauth-serving-cert\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446913 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26659604-60d4-488b-a38b-52fedf8d098d-config\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446937 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/26659604-60d4-488b-a38b-52fedf8d098d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.446958 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447093 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-trusted-ca-bundle\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447125 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtvdz\" (UniqueName: \"kubernetes.io/projected/7cd86268-0ac9-4583-8285-dc07ea50cc28-kube-api-access-dtvdz\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447151 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-serving-cert\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447192 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447303 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26659604-60d4-488b-a38b-52fedf8d098d-images\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447341 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dwt\" (UniqueName: \"kubernetes.io/projected/b3318f2b-f771-4c19-ac42-46af048219f9-kube-api-access-z4dwt\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447368 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447395 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447422 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447452 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cd86268-0ac9-4583-8285-dc07ea50cc28-trusted-ca\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447481 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3318f2b-f771-4c19-ac42-46af048219f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447507 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447521 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447537 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447564 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447590 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-policies\") pod 
\"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447622 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-dir\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447652 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-client-ca\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447677 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447702 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447728 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z56zz\" (UniqueName: \"kubernetes.io/projected/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-kube-api-access-z56zz\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447756 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-serving-cert\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447780 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447818 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447855 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-serving-cert\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447883 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447907 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447937 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73f4901e-a2c3-474d-8d52-972b775c2017-serving-cert\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.447964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-encryption-config\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448027 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448103 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc7cn\" (UniqueName: \"kubernetes.io/projected/a021eefd-fd0f-458d-8629-d40a02daa592-kube-api-access-hc7cn\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/073951ec-7b3a-4f78-89ee-771308246966-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dvpwv\" (UID: \"073951ec-7b3a-4f78-89ee-771308246966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdf55\" (UniqueName: \"kubernetes.io/projected/73f4901e-a2c3-474d-8d52-972b775c2017-kube-api-access-kdf55\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: 
\"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448192 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448204 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448338 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ffrj5"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.448731 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-99q5b"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.449253 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6xvln"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.449601 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.449651 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.449735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwkl9\" (UniqueName: \"kubernetes.io/projected/073951ec-7b3a-4f78-89ee-771308246966-kube-api-access-gwkl9\") pod \"cluster-samples-operator-665b6dd947-dvpwv\" (UID: \"073951ec-7b3a-4f78-89ee-771308246966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.449815 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-audit-policies\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.449861 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.450074 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.450162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a021eefd-fd0f-458d-8629-d40a02daa592-audit-dir\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.450199 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.450298 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-service-ca\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.451560 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.455541 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.473630 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.474299 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.475558 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.475614 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.476711 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f29zd"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.477154 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.477540 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.477993 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.478132 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.478396 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.478727 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.479040 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.514066 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.524688 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.533495 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.533570 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.533578 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.533757 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.534136 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.534524 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.534593 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.534769 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.534823 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.534958 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.535016 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.535094 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 10:57:47 crc kubenswrapper[4835]: 
I1002 10:57:47.537285 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.537591 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.542004 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.554524 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.554565 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555092 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555323 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-config\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555365 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khf59\" (UniqueName: \"kubernetes.io/projected/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-kube-api-access-khf59\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555390 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-etcd-client\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555407 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5rc\" (UniqueName: \"kubernetes.io/projected/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-kube-api-access-8b5rc\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd86268-0ac9-4583-8285-dc07ea50cc28-serving-cert\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555442 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd86268-0ac9-4583-8285-dc07ea50cc28-config\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc 
kubenswrapper[4835]: I1002 10:57:47.555460 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555480 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556165 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556312 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556423 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.555480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-oauth-config\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556570 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556617 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97mbg\" (UniqueName: \"kubernetes.io/projected/26659604-60d4-488b-a38b-52fedf8d098d-kube-api-access-97mbg\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556653 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kglgp\" (UniqueName: \"kubernetes.io/projected/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-kube-api-access-kglgp\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556682 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556711 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3318f2b-f771-4c19-ac42-46af048219f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556736 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-config\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556764 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-trusted-ca-bundle\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556786 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-oauth-serving-cert\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26659604-60d4-488b-a38b-52fedf8d098d-config\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/26659604-60d4-488b-a38b-52fedf8d098d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556847 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556875 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtvdz\" (UniqueName: \"kubernetes.io/projected/7cd86268-0ac9-4583-8285-dc07ea50cc28-kube-api-access-dtvdz\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556893 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-serving-cert\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556959 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4dwt\" (UniqueName: \"kubernetes.io/projected/b3318f2b-f771-4c19-ac42-46af048219f9-kube-api-access-z4dwt\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556958 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.558949 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.558949 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-config\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.559208 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd86268-0ac9-4583-8285-dc07ea50cc28-config\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.559255 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.559460 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.559482 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.559504 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.556984 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.559951 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.562820 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.563257 4835 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.563511 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.563764 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.564158 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.564436 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.564611 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.565562 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.566066 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vpsrn"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.566482 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rzxmz"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.566516 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.566626 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.566813 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.566878 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.567150 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.567269 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.567633 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26659604-60d4-488b-a38b-52fedf8d098d-images\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573212 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573261 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573283 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cd86268-0ac9-4583-8285-dc07ea50cc28-trusted-ca\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573311 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3318f2b-f771-4c19-ac42-46af048219f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573451 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573470 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 
10:57:47.573491 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-policies\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-dir\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.573528 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-client-ca\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.574126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3318f2b-f771-4c19-ac42-46af048219f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.574684 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-oauth-config\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.574778 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.575424 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-oauth-serving-cert\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.576330 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.576437 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-trusted-ca-bundle\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " 
pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.577162 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.577171 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26659604-60d4-488b-a38b-52fedf8d098d-config\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.577399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-policies\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.578098 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26659604-60d4-488b-a38b-52fedf8d098d-images\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.578121 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.578170 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.578555 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.578758 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-client-ca\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.579135 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.579440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-serving-cert\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.579455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-dir\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.579470 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cd86268-0ac9-4583-8285-dc07ea50cc28-trusted-ca\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.580311 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-grwx8"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.587171 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd86268-0ac9-4583-8285-dc07ea50cc28-serving-cert\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.587310 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.587376 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z56zz\" (UniqueName: \"kubernetes.io/projected/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-kube-api-access-z56zz\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.587401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-serving-cert\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.587455 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.588034 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.588178 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.581800 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.588253 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73f4901e-a2c3-474d-8d52-972b775c2017-serving-cert\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.588273 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.588039 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.581883 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.581967 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.588194 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.588993 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.589362 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.589447 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.589613 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.589682 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.590381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-encryption-config\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.590446 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.590708 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.590794 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.590991 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdf55\" (UniqueName: \"kubernetes.io/projected/73f4901e-a2c3-474d-8d52-972b775c2017-kube-api-access-kdf55\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.591094 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc7cn\" (UniqueName: \"kubernetes.io/projected/a021eefd-fd0f-458d-8629-d40a02daa592-kube-api-access-hc7cn\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.590753 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gckdg"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.591259 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.596587 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zl6kd"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.596706 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.596786 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-fbjn7"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.597356 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.597554 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-serving-cert\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.591057 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.593757 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-encryption-config\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.598494 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.594063 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-etcd-client\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.599155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/073951ec-7b3a-4f78-89ee-771308246966-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dvpwv\" (UID: \"073951ec-7b3a-4f78-89ee-771308246966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.599309 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.599483 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.600192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-config\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.600335 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.601299 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.602419 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwkl9\" (UniqueName: \"kubernetes.io/projected/073951ec-7b3a-4f78-89ee-771308246966-kube-api-access-gwkl9\") pod \"cluster-samples-operator-665b6dd947-dvpwv\" (UID: \"073951ec-7b3a-4f78-89ee-771308246966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.602808 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-audit-policies\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.602866 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a021eefd-fd0f-458d-8629-d40a02daa592-audit-dir\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.602897 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.602925 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-service-ca\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.603952 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-service-ca\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.604470 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a021eefd-fd0f-458d-8629-d40a02daa592-audit-dir\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.604542 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/a021eefd-fd0f-458d-8629-d40a02daa592-audit-policies\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.605272 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3318f2b-f771-4c19-ac42-46af048219f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.609563 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/073951ec-7b3a-4f78-89ee-771308246966-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dvpwv\" (UID: \"073951ec-7b3a-4f78-89ee-771308246966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.609936 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.615133 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lz6t7"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.615188 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lbswp"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.615662 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.616247 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.621039 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73f4901e-a2c3-474d-8d52-972b775c2017-serving-cert\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.625645 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-l2rvj"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.626529 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.626967 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.627457 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.628088 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.628120 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/26659604-60d4-488b-a38b-52fedf8d098d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.628135 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.628370 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.628685 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a021eefd-fd0f-458d-8629-d40a02daa592-serving-cert\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.628917 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.629021 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.629246 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-serving-cert\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.629265 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.642471 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97mbg\" (UniqueName: \"kubernetes.io/projected/26659604-60d4-488b-a38b-52fedf8d098d-kube-api-access-97mbg\") pod \"machine-api-operator-5694c8668f-zshpt\" (UID: \"26659604-60d4-488b-a38b-52fedf8d098d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.629355 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.629321 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.629445 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.629365 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.644789 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b5rc\" (UniqueName: \"kubernetes.io/projected/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-kube-api-access-8b5rc\") pod \"oauth-openshift-558db77b4-lm4nm\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.645290 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.646420 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.646567 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.646638 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.648853 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.654660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kglgp\" (UniqueName: \"kubernetes.io/projected/32bb0a90-5866-4c3b-b96d-62eb9d4e04ca-kube-api-access-kglgp\") pod \"cluster-image-registry-operator-dc59b4c8b-tw5z2\" (UID: \"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.655042 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.656690 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.661512 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ttsbw"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.663072 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.663510 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.666561 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.667416 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.667752 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p8g6s"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.668407 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.668655 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.669280 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ffrj5"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.670301 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm4nm"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.671666 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.673027 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zshpt"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.674589 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.676172 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.676416 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.679339 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.680769 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.681755 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.683319 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f29zd"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.683506 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.684117 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.685502 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2vmv5"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.686151 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-99q5b"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.686278 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.688824 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.689988 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-grwx8"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.691987 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.692782 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6xvln"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.694096 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.695759 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.696524 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.699294 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.700790 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lbswp"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.702745 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hvdzw"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.703408 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.703912 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a39aeae5-118c-4f69-8908-e35f2b1615b8-auth-proxy-config\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.703947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-service-ca-bundle\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704003 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lz27\" (UniqueName: \"kubernetes.io/projected/e81c3107-368c-40ff-b16f-4a97cffa7d13-kube-api-access-8lz27\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.703998 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704095 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39aeae5-118c-4f69-8908-e35f2b1615b8-config\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704234 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6bj\" (UniqueName: \"kubernetes.io/projected/8192291d-afb3-4169-a5e3-177080eacd7d-kube-api-access-kn6bj\") pod \"migrator-59844c95c7-6vp7k\" (UID: \"8192291d-afb3-4169-a5e3-177080eacd7d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704381 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqhk\" (UniqueName: \"kubernetes.io/projected/35729d8f-7bae-44f7-be5c-054cd97fb39c-kube-api-access-vcqhk\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704445 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704481 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e81c3107-368c-40ff-b16f-4a97cffa7d13-metrics-tls\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704600 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81c3107-368c-40ff-b16f-4a97cffa7d13-trusted-ca\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9tqs\" (UniqueName: \"kubernetes.io/projected/a39aeae5-118c-4f69-8908-e35f2b1615b8-kube-api-access-q9tqs\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704707 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81c3107-368c-40ff-b16f-4a97cffa7d13-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704729 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-config\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704769 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a39aeae5-118c-4f69-8908-e35f2b1615b8-machine-approver-tls\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.704793 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35729d8f-7bae-44f7-be5c-054cd97fb39c-serving-cert\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.705994 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-l2rvj"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.706633 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.707680 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-fbjn7"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.708744 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.709733 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-459jg"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.710947 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.711086 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.711352 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.712585 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.713331 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ttsbw"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.714495 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p8g6s"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.716906 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2vmv5"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.718458 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.719834 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-459jg"] Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.724033 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.743950 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.763980 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.791828 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.804271 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805537 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lz27\" (UniqueName: \"kubernetes.io/projected/e81c3107-368c-40ff-b16f-4a97cffa7d13-kube-api-access-8lz27\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805613 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39aeae5-118c-4f69-8908-e35f2b1615b8-config\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805649 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn6bj\" (UniqueName: \"kubernetes.io/projected/8192291d-afb3-4169-a5e3-177080eacd7d-kube-api-access-kn6bj\") pod \"migrator-59844c95c7-6vp7k\" (UID: \"8192291d-afb3-4169-a5e3-177080eacd7d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805694 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqhk\" (UniqueName: \"kubernetes.io/projected/35729d8f-7bae-44f7-be5c-054cd97fb39c-kube-api-access-vcqhk\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805751 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e81c3107-368c-40ff-b16f-4a97cffa7d13-metrics-tls\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805808 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81c3107-368c-40ff-b16f-4a97cffa7d13-trusted-ca\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805837 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tqs\" (UniqueName: \"kubernetes.io/projected/a39aeae5-118c-4f69-8908-e35f2b1615b8-kube-api-access-q9tqs\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805876 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81c3107-368c-40ff-b16f-4a97cffa7d13-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-config\") pod 
\"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a39aeae5-118c-4f69-8908-e35f2b1615b8-machine-approver-tls\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.805984 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35729d8f-7bae-44f7-be5c-054cd97fb39c-serving-cert\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.806046 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a39aeae5-118c-4f69-8908-e35f2b1615b8-auth-proxy-config\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.806084 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-service-ca-bundle\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.807168 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81c3107-368c-40ff-b16f-4a97cffa7d13-trusted-ca\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.807568 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-config\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.808142 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39aeae5-118c-4f69-8908-e35f2b1615b8-config\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.808599 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.808934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35729d8f-7bae-44f7-be5c-054cd97fb39c-service-ca-bundle\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.809090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a39aeae5-118c-4f69-8908-e35f2b1615b8-auth-proxy-config\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.811375 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a39aeae5-118c-4f69-8908-e35f2b1615b8-machine-approver-tls\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.813596 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35729d8f-7bae-44f7-be5c-054cd97fb39c-serving-cert\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.813931 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e81c3107-368c-40ff-b16f-4a97cffa7d13-metrics-tls\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.825543 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.845581 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.868632 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.884971 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.904993 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.926721 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.943985 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 02 10:57:47 crc kubenswrapper[4835]: I1002 10:57:47.966489 4835 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:47.999424 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khf59\" (UniqueName: \"kubernetes.io/projected/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-kube-api-access-khf59\") pod \"console-f9d7485db-rzxmz\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.004984 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.008781 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zshpt"] Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.013972 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm4nm"] Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.015062 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2"] Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.023709 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.057028 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtvdz\" (UniqueName: \"kubernetes.io/projected/7cd86268-0ac9-4583-8285-dc07ea50cc28-kube-api-access-dtvdz\") pod \"console-operator-58897d9998-lz6t7\" (UID: \"7cd86268-0ac9-4583-8285-dc07ea50cc28\") " pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:48 crc kubenswrapper[4835]: W1002 10:57:48.077892 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26659604_60d4_488b_a38b_52fedf8d098d.slice/crio-d624ea6dc827b24bbd7a48a2a677648dd1f90318a22f92e4305d81f99dd245e2 WatchSource:0}: Error finding container d624ea6dc827b24bbd7a48a2a677648dd1f90318a22f92e4305d81f99dd245e2: Status 404 returned error can't find the container with id d624ea6dc827b24bbd7a48a2a677648dd1f90318a22f92e4305d81f99dd245e2 Oct 02 10:57:48 crc kubenswrapper[4835]: W1002 10:57:48.079048 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c7e4e66_b9ef_43fb_a1b9_539bba8ec487.slice/crio-f8ed007ad497e0723d7f590c76360381a05b7ecb7cffb9cea0d7966f7fe6567a WatchSource:0}: Error finding container f8ed007ad497e0723d7f590c76360381a05b7ecb7cffb9cea0d7966f7fe6567a: Status 404 returned error can't find the container with id f8ed007ad497e0723d7f590c76360381a05b7ecb7cffb9cea0d7966f7fe6567a Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.080839 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4dwt\" (UniqueName: \"kubernetes.io/projected/b3318f2b-f771-4c19-ac42-46af048219f9-kube-api-access-z4dwt\") pod \"openshift-apiserver-operator-796bbdcf4f-tx27b\" (UID: \"b3318f2b-f771-4c19-ac42-46af048219f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.086240 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" 
event={"ID":"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca","Type":"ContainerStarted","Data":"5c4db9ef93d016150457ccad135b96ea4ec16be154b0008dc13afc03b052ab87"} Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.088169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" event={"ID":"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487","Type":"ContainerStarted","Data":"f8ed007ad497e0723d7f590c76360381a05b7ecb7cffb9cea0d7966f7fe6567a"} Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.090327 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" event={"ID":"26659604-60d4-488b-a38b-52fedf8d098d","Type":"ContainerStarted","Data":"d624ea6dc827b24bbd7a48a2a677648dd1f90318a22f92e4305d81f99dd245e2"} Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.118650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z56zz\" (UniqueName: \"kubernetes.io/projected/b41c8fbf-c78d-4b6c-8241-a4bbd2654291-kube-api-access-z56zz\") pod \"openshift-config-operator-7777fb866f-gckdg\" (UID: \"b41c8fbf-c78d-4b6c-8241-a4bbd2654291\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.126687 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.143970 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.164169 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.191724 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.204660 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.221405 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.223595 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.244608 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.246187 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.260524 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.265316 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.283672 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.286285 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.307157 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.326872 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.347013 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.363824 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.391204 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.405202 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.407873 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gckdg"] Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.424779 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.443662 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.450130 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rzxmz"] Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.465644 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.487772 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.493112 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b"] Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.505188 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.526389 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.544521 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lz6t7"] Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.564407 4835 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.564713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc7cn\" (UniqueName: \"kubernetes.io/projected/a021eefd-fd0f-458d-8629-d40a02daa592-kube-api-access-hc7cn\") pod \"apiserver-7bbb656c7d-zctj2\" (UID: \"a021eefd-fd0f-458d-8629-d40a02daa592\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.584990 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.602350 4835 request.go:700] Waited for 1.003349099s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/secrets?fieldSelector=metadata.name%3Detcd-operator-serving-cert&limit=500&resourceVersion=0 Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.604812 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.624754 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.643774 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.664020 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.684731 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: W1002 10:57:48.684960 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3318f2b_f771_4c19_ac42_46af048219f9.slice/crio-7ddee2c390d88a28752076b2743c6d349f021e0efaf0b7c793c615d4a4a28d96 WatchSource:0}: Error finding container 7ddee2c390d88a28752076b2743c6d349f021e0efaf0b7c793c615d4a4a28d96: Status 404 returned error can't find the container with id 7ddee2c390d88a28752076b2743c6d349f021e0efaf0b7c793c615d4a4a28d96 Oct 02 10:57:48 crc kubenswrapper[4835]: W1002 10:57:48.689785 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cd86268_0ac9_4583_8285_dc07ea50cc28.slice/crio-ef84c9e93e9189a2678e0ab6cf93a189f4f91f42b498b04fdb4e048ac63c32a7 WatchSource:0}: Error finding container ef84c9e93e9189a2678e0ab6cf93a189f4f91f42b498b04fdb4e048ac63c32a7: Status 404 returned error can't find the container with id ef84c9e93e9189a2678e0ab6cf93a189f4f91f42b498b04fdb4e048ac63c32a7 Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.703258 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.725258 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.744428 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.785476 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.794250 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdf55\" (UniqueName: \"kubernetes.io/projected/73f4901e-a2c3-474d-8d52-972b775c2017-kube-api-access-kdf55\") pod \"controller-manager-879f6c89f-zl6kd\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.806049 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.825646 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.835139 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.844283 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.851989 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.864085 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.883997 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.904452 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.943072 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwkl9\" (UniqueName: \"kubernetes.io/projected/073951ec-7b3a-4f78-89ee-771308246966-kube-api-access-gwkl9\") pod \"cluster-samples-operator-665b6dd947-dvpwv\" (UID: \"073951ec-7b3a-4f78-89ee-771308246966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.964388 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 02 10:57:48 crc kubenswrapper[4835]: I1002 10:57:48.986925 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.004236 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.014042 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.025254 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.045185 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.065066 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.085732 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.088838 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zl6kd"] Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.096676 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" event={"ID":"b3318f2b-f771-4c19-ac42-46af048219f9","Type":"ContainerStarted","Data":"7ddee2c390d88a28752076b2743c6d349f021e0efaf0b7c793c615d4a4a28d96"} Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.099762 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" event={"ID":"b41c8fbf-c78d-4b6c-8241-a4bbd2654291","Type":"ContainerStarted","Data":"403427ee5e26934c2fce53956c3c828b6b8c15aa8a387819441af74b70f5af9b"} Oct 02 10:57:49 crc kubenswrapper[4835]: W1002 10:57:49.101352 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73f4901e_a2c3_474d_8d52_972b775c2017.slice/crio-1be9255f09240e9ff3743362e9e389fe9d286273b6fba38be644ae5ba1eb5676 WatchSource:0}: Error finding container 1be9255f09240e9ff3743362e9e389fe9d286273b6fba38be644ae5ba1eb5676: Status 404 returned error can't find the container with id 1be9255f09240e9ff3743362e9e389fe9d286273b6fba38be644ae5ba1eb5676 Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.109838 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.113350 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" event={"ID":"26659604-60d4-488b-a38b-52fedf8d098d","Type":"ContainerStarted","Data":"7564451f3bf29d928904ad3a24fb61d841e5a843ae5c48382df4aee66d474f7d"} Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.119868 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" event={"ID":"7cd86268-0ac9-4583-8285-dc07ea50cc28","Type":"ContainerStarted","Data":"ef84c9e93e9189a2678e0ab6cf93a189f4f91f42b498b04fdb4e048ac63c32a7"} Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.121496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rzxmz" event={"ID":"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0","Type":"ContainerStarted","Data":"5e5c620ec42765cebe7c6a975a2f3625089389c75b7439f16e5f37ecc7e02271"} Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.125586 4835 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2"] Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.126438 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.145435 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.164374 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.184670 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.204361 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.212790 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv"] Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.224101 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.244599 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.265137 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.284433 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.304525 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.324016 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.344766 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.364737 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.383988 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.404052 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.425680 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.444879 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.463785 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"canary-serving-cert" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.485214 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.504445 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.524076 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.547595 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.563944 4835 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.583800 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.622034 4835 request.go:700] Waited for 1.814950044s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/serviceaccounts/ingress-operator/token Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.635279 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tqs\" (UniqueName: \"kubernetes.io/projected/a39aeae5-118c-4f69-8908-e35f2b1615b8-kube-api-access-q9tqs\") pod \"machine-approver-56656f9798-6qmsd\" (UID: \"a39aeae5-118c-4f69-8908-e35f2b1615b8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.641839 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81c3107-368c-40ff-b16f-4a97cffa7d13-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.660547 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6bj\" (UniqueName: \"kubernetes.io/projected/8192291d-afb3-4169-a5e3-177080eacd7d-kube-api-access-kn6bj\") pod \"migrator-59844c95c7-6vp7k\" (UID: \"8192291d-afb3-4169-a5e3-177080eacd7d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.680896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lz27\" (UniqueName: \"kubernetes.io/projected/e81c3107-368c-40ff-b16f-4a97cffa7d13-kube-api-access-8lz27\") pod \"ingress-operator-5b745b69d9-z44g2\" (UID: \"e81c3107-368c-40ff-b16f-4a97cffa7d13\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.700451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqhk\" (UniqueName: \"kubernetes.io/projected/35729d8f-7bae-44f7-be5c-054cd97fb39c-kube-api-access-vcqhk\") pod \"authentication-operator-69f744f599-ffrj5\" (UID: \"35729d8f-7bae-44f7-be5c-054cd97fb39c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.737789 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-etcd-client\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.737898 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-serving-cert\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.737979 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-audit\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.737996 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-stats-auth\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738019 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r567v\" (UniqueName: \"kubernetes.io/projected/3da125d5-132a-44df-ba26-fd6305dabcdc-kube-api-access-r567v\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738094 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-trusted-ca\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738123 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpp67\" (UniqueName: \"kubernetes.io/projected/bcd814c3-6b27-4cd7-a315-0dec8015d04f-kube-api-access-xpp67\") pod \"control-plane-machine-set-operator-78cbb6b69f-44wvb\" (UID: \"bcd814c3-6b27-4cd7-a315-0dec8015d04f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738158 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da125d5-132a-44df-ba26-fd6305dabcdc-serving-cert\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc 
kubenswrapper[4835]: I1002 10:57:49.738212 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121f1415-149d-42d8-a0bd-28aa16e3a55a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: E1002 10:57:49.738618 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.238605153 +0000 UTC m=+146.798512734 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738910 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-default-certificate\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738959 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-bound-sa-token\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.738981 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-encryption-config\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739041 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm77p\" (UniqueName: \"kubernetes.io/projected/495e3390-c592-4669-b313-ca2d397746f7-kube-api-access-qm77p\") pod \"downloads-7954f5f757-6xvln\" (UID: \"495e3390-c592-4669-b313-ca2d397746f7\") " pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739063 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-tls\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739079 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9aa78d4-be0b-4eb3-9cde-117c72496d16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739123 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27dd8636-3710-41e0-ad06-680140702c28-audit-dir\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739146 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-metrics-certs\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgq4q\" (UniqueName: \"kubernetes.io/projected/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-kube-api-access-wgq4q\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9aa78d4-be0b-4eb3-9cde-117c72496d16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739195 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-etcd-serving-ca\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739234 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507a9c92-aeba-4593-8280-6e6244c34b01-config\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739268 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zq5\" 
(UniqueName: \"kubernetes.io/projected/27dd8636-3710-41e0-ad06-680140702c28-kube-api-access-b4zq5\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739302 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-config\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/507a9c92-aeba-4593-8280-6e6244c34b01-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739415 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-client-ca\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739432 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-image-import-ca\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739446 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq9zn\" (UniqueName: \"kubernetes.io/projected/121f1415-149d-42d8-a0bd-28aa16e3a55a-kube-api-access-fq9zn\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739499 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gd9l\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-kube-api-access-5gd9l\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739515 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-service-ca-bundle\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739736 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcd814c3-6b27-4cd7-a315-0dec8015d04f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-44wvb\" (UID: \"bcd814c3-6b27-4cd7-a315-0dec8015d04f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739856 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/27dd8636-3710-41e0-ad06-680140702c28-node-pullsecrets\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739930 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-trusted-ca-bundle\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.739961 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/507a9c92-aeba-4593-8280-6e6244c34b01-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.740013 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121f1415-149d-42d8-a0bd-28aa16e3a55a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.740033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-config\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.740087 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-certificates\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.750701 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.776770 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" Oct 02 10:57:49 crc kubenswrapper[4835]: W1002 10:57:49.799441 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39aeae5_118c_4f69_8908_e35f2b1615b8.slice/crio-a26b7e2888ce734bb16709a544140a96db5d4132b16fa172b7ff31df21969a5d WatchSource:0}: Error finding container a26b7e2888ce734bb16709a544140a96db5d4132b16fa172b7ff31df21969a5d: Status 404 returned error can't find the container with id a26b7e2888ce734bb16709a544140a96db5d4132b16fa172b7ff31df21969a5d Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.808998 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.816737 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841004 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-config\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/507a9c92-aeba-4593-8280-6e6244c34b01-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841257 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqt7\" (UniqueName: \"kubernetes.io/projected/3bab576b-7852-4a66-a119-57f3f3d7c6e5-kube-api-access-fwqt7\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841279 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-mountpoint-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841300 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-client-ca\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 
10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq9zn\" (UniqueName: \"kubernetes.io/projected/121f1415-149d-42d8-a0bd-28aa16e3a55a-kube-api-access-fq9zn\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gd9l\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-kube-api-access-5gd9l\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841349 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-service-ca-bundle\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841367 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l89kj\" (UniqueName: \"kubernetes.io/projected/7c57d059-e457-4322-9580-32bdf8993a83-kube-api-access-l89kj\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841390 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841408 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9bda2-9253-415f-83f1-794c4ba878ce-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841438 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcad80cf-e547-402d-9259-a811a0c542c5-proxy-tls\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841456 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44t29\" (UniqueName: \"kubernetes.io/projected/380763b9-fdb6-4b62-a8e0-c775708be101-kube-api-access-44t29\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841478 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/27dd8636-3710-41e0-ad06-680140702c28-node-pullsecrets\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841501 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-trusted-ca-bundle\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841537 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-certificates\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2jkw\" (UniqueName: \"kubernetes.io/projected/fcad80cf-e547-402d-9259-a811a0c542c5-kube-api-access-s2jkw\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841588 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fhk2\" (UniqueName: \"kubernetes.io/projected/a6c9709a-6e83-4ccd-9647-82174b1840d3-kube-api-access-8fhk2\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841606 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4818cdd1-5a24-4562-8c2e-133ad936c184-serving-cert\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fcad80cf-e547-402d-9259-a811a0c542c5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-etcd-client\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc 
kubenswrapper[4835]: I1002 10:57:49.841670 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-serving-cert\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841686 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91a50967-088f-4901-859e-1d1cb63549e6-metrics-tls\") pod \"dns-operator-744455d44c-ttsbw\" (UID: \"91a50967-088f-4901-859e-1d1cb63549e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841707 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-ca\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841738 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5dbc29a-2839-4feb-b37b-e929809f1ca9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-audit\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841778 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-images\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841794 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e59873-8cf1-4151-8ada-8b6ff4df2722-cert\") pod \"ingress-canary-2vmv5\" (UID: \"b4e59873-8cf1-4151-8ada-8b6ff4df2722\") " pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-trusted-ca\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841830 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpp67\" (UniqueName: \"kubernetes.io/projected/bcd814c3-6b27-4cd7-a315-0dec8015d04f-kube-api-access-xpp67\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-44wvb\" (UID: \"bcd814c3-6b27-4cd7-a315-0dec8015d04f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841863 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da125d5-132a-44df-ba26-fd6305dabcdc-serving-cert\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841881 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90718d42-2c78-46d1-93a1-dfe0adae10f0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841899 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/380763b9-fdb6-4b62-a8e0-c775708be101-secret-volume\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841928 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121f1415-149d-42d8-a0bd-28aa16e3a55a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841946 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-bound-sa-token\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841968 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.841986 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/955b6ef4-bf5f-4ac4-aece-03c12040abe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xn57b\" (UID: \"955b6ef4-bf5f-4ac4-aece-03c12040abe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842003 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-socket-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842021 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-encryption-config\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-tls\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9aa78d4-be0b-4eb3-9cde-117c72496d16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842074 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27dd8636-3710-41e0-ad06-680140702c28-audit-dir\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842090 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-signing-cabundle\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842116 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-metrics-certs\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842134 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4818cdd1-5a24-4562-8c2e-133ad936c184-config\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" 
Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842171 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5dbc29a-2839-4feb-b37b-e929809f1ca9-config\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842188 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46259\" (UniqueName: \"kubernetes.io/projected/d676da73-15a0-474f-8928-ba09d685c500-kube-api-access-46259\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842208 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zq5\" (UniqueName: \"kubernetes.io/projected/27dd8636-3710-41e0-ad06-680140702c28-kube-api-access-b4zq5\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842259 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bab576b-7852-4a66-a119-57f3f3d7c6e5-metrics-tls\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842277 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-image-import-ca\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842295 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6gvm\" (UniqueName: \"kubernetes.io/projected/955b6ef4-bf5f-4ac4-aece-03c12040abe4-kube-api-access-s6gvm\") pod \"package-server-manager-789f6589d5-xn57b\" (UID: \"955b6ef4-bf5f-4ac4-aece-03c12040abe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842311 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-plugins-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842329 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6c9709a-6e83-4ccd-9647-82174b1840d3-srv-cert\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f597q\" (UniqueName: \"kubernetes.io/projected/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-kube-api-access-f597q\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842375 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcd814c3-6b27-4cd7-a315-0dec8015d04f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-44wvb\" (UID: \"bcd814c3-6b27-4cd7-a315-0dec8015d04f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842404 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f9bda2-9253-415f-83f1-794c4ba878ce-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842421 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90718d42-2c78-46d1-93a1-dfe0adae10f0-srv-cert\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:49 crc kubenswrapper[4835]: E1002 10:57:49.842451 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.342422843 +0000 UTC m=+146.902330434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842506 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f3fb635-33e3-4a97-bcac-42f01899aeb9-tmpfs\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/507a9c92-aeba-4593-8280-6e6244c34b01-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842608 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121f1415-149d-42d8-a0bd-28aa16e3a55a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842639 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d676da73-15a0-474f-8928-ba09d685c500-certs\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842682 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-config\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842709 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5dbc29a-2839-4feb-b37b-e929809f1ca9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842741 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-signing-key\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842768 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/380763b9-fdb6-4b62-a8e0-c775708be101-config-volume\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842845 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nf9d\" (UniqueName: \"kubernetes.io/projected/90718d42-2c78-46d1-93a1-dfe0adae10f0-kube-api-access-9nf9d\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842873 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9jf5\" (UniqueName: \"kubernetes.io/projected/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-kube-api-access-z9jf5\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842899 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-config\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842928 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-csi-data-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842956 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-serving-cert\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.842982 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-client\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843010 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghm2c\" (UniqueName: \"kubernetes.io/projected/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-kube-api-access-ghm2c\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843037 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-registration-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843080 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xdpf\" (UniqueName: \"kubernetes.io/projected/91a50967-088f-4901-859e-1d1cb63549e6-kube-api-access-9xdpf\") pod \"dns-operator-744455d44c-ttsbw\" (UID: \"91a50967-088f-4901-859e-1d1cb63549e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-stats-auth\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d676da73-15a0-474f-8928-ba09d685c500-node-bootstrap-token\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843207 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-service-ca\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqbhw\" (UniqueName: \"kubernetes.io/projected/5f3fb635-33e3-4a97-bcac-42f01899aeb9-kube-api-access-hqbhw\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843284 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r567v\" (UniqueName: \"kubernetes.io/projected/3da125d5-132a-44df-ba26-fd6305dabcdc-kube-api-access-r567v\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843310 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kflk9\" (UniqueName: \"kubernetes.io/projected/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-kube-api-access-kflk9\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843334 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-proxy-tls\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843360 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6c9709a-6e83-4ccd-9647-82174b1840d3-profile-collector-cert\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843387 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bab576b-7852-4a66-a119-57f3f3d7c6e5-config-volume\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843437 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f3fb635-33e3-4a97-bcac-42f01899aeb9-apiservice-cert\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843460 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwnld\" (UniqueName: \"kubernetes.io/projected/b4e59873-8cf1-4151-8ada-8b6ff4df2722-kube-api-access-rwnld\") pod \"ingress-canary-2vmv5\" (UID: \"b4e59873-8cf1-4151-8ada-8b6ff4df2722\") " pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843487 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6bls\" (UniqueName: \"kubernetes.io/projected/5211ce1a-3935-4397-8ad3-34fe963cf626-kube-api-access-d6bls\") pod \"multus-admission-controller-857f4d67dd-l2rvj\" (UID: \"5211ce1a-3935-4397-8ad3-34fe963cf626\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843520 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843543 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-default-certificate\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-audit\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843565 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-config\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.843568 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tff7\" (UniqueName: \"kubernetes.io/projected/91f9bda2-9253-415f-83f1-794c4ba878ce-kube-api-access-5tff7\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: E1002 10:57:49.844252 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.344242478 +0000 UTC m=+146.904150069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.844925 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/27dd8636-3710-41e0-ad06-680140702c28-node-pullsecrets\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.845016 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-trusted-ca\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.845830 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-client-ca\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.848189 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-trusted-ca-bundle\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.848717 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9aa78d4-be0b-4eb3-9cde-117c72496d16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.848754 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27dd8636-3710-41e0-ad06-680140702c28-audit-dir\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.850088 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-etcd-client\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851211 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm77p\" (UniqueName: \"kubernetes.io/projected/495e3390-c592-4669-b313-ca2d397746f7-kube-api-access-qm77p\") pod \"downloads-7954f5f757-6xvln\" (UID: 
\"495e3390-c592-4669-b313-ca2d397746f7\") " pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851325 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgq4q\" (UniqueName: \"kubernetes.io/projected/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-kube-api-access-wgq4q\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851349 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-certificates\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5211ce1a-3935-4397-8ad3-34fe963cf626-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-l2rvj\" (UID: \"5211ce1a-3935-4397-8ad3-34fe963cf626\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851425 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-etcd-serving-ca\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-service-ca-bundle\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851457 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9aa78d4-be0b-4eb3-9cde-117c72496d16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851527 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851558 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f3fb635-33e3-4a97-bcac-42f01899aeb9-webhook-cert\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851601 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507a9c92-aeba-4593-8280-6e6244c34b01-config\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851625 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.851648 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwm2l\" (UniqueName: \"kubernetes.io/projected/4818cdd1-5a24-4562-8c2e-133ad936c184-kube-api-access-wwm2l\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.852258 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-image-import-ca\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.852585 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507a9c92-aeba-4593-8280-6e6244c34b01-config\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.853095 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/27dd8636-3710-41e0-ad06-680140702c28-etcd-serving-ca\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.853945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcd814c3-6b27-4cd7-a315-0dec8015d04f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-44wvb\" (UID: \"bcd814c3-6b27-4cd7-a315-0dec8015d04f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.854580 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121f1415-149d-42d8-a0bd-28aa16e3a55a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.854814 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-config\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.855363 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121f1415-149d-42d8-a0bd-28aa16e3a55a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.855450 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9aa78d4-be0b-4eb3-9cde-117c72496d16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.856017 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-stats-auth\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.856117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-encryption-config\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.864458 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27dd8636-3710-41e0-ad06-680140702c28-serving-cert\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.865697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-tls\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.865955 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/507a9c92-aeba-4593-8280-6e6244c34b01-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.866418 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da125d5-132a-44df-ba26-fd6305dabcdc-serving-cert\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.867236 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-metrics-certs\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.869173 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-default-certificate\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.882996 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/507a9c92-aeba-4593-8280-6e6244c34b01-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ghgvd\" (UID: \"507a9c92-aeba-4593-8280-6e6244c34b01\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.923642 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq9zn\" (UniqueName: \"kubernetes.io/projected/121f1415-149d-42d8-a0bd-28aa16e3a55a-kube-api-access-fq9zn\") pod \"openshift-controller-manager-operator-756b6f6bc6-4lg7t\" (UID: \"121f1415-149d-42d8-a0bd-28aa16e3a55a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.942306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gd9l\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-kube-api-access-5gd9l\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.953916 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:49 crc kubenswrapper[4835]: E1002 10:57:49.954354 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.454190621 +0000 UTC m=+147.014098202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954516 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqt7\" (UniqueName: \"kubernetes.io/projected/3bab576b-7852-4a66-a119-57f3f3d7c6e5-kube-api-access-fwqt7\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954601 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-mountpoint-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l89kj\" (UniqueName: \"kubernetes.io/projected/7c57d059-e457-4322-9580-32bdf8993a83-kube-api-access-l89kj\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954706 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9bda2-9253-415f-83f1-794c4ba878ce-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954887 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44t29\" (UniqueName: \"kubernetes.io/projected/380763b9-fdb6-4b62-a8e0-c775708be101-kube-api-access-44t29\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954909 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcad80cf-e547-402d-9259-a811a0c542c5-proxy-tls\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954952 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2jkw\" (UniqueName: \"kubernetes.io/projected/fcad80cf-e547-402d-9259-a811a0c542c5-kube-api-access-s2jkw\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954969 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fhk2\" (UniqueName: \"kubernetes.io/projected/a6c9709a-6e83-4ccd-9647-82174b1840d3-kube-api-access-8fhk2\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.954989 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4818cdd1-5a24-4562-8c2e-133ad936c184-serving-cert\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955011 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fcad80cf-e547-402d-9259-a811a0c542c5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91a50967-088f-4901-859e-1d1cb63549e6-metrics-tls\") pod \"dns-operator-744455d44c-ttsbw\" (UID: \"91a50967-088f-4901-859e-1d1cb63549e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955051 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-ca\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955069 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5dbc29a-2839-4feb-b37b-e929809f1ca9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955088 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e59873-8cf1-4151-8ada-8b6ff4df2722-cert\") pod \"ingress-canary-2vmv5\" (UID: \"b4e59873-8cf1-4151-8ada-8b6ff4df2722\") " pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955119 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-images\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90718d42-2c78-46d1-93a1-dfe0adae10f0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955174 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/380763b9-fdb6-4b62-a8e0-c775708be101-secret-volume\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955195 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-socket-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955236 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955262 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/955b6ef4-bf5f-4ac4-aece-03c12040abe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xn57b\" (UID: \"955b6ef4-bf5f-4ac4-aece-03c12040abe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955284 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-signing-cabundle\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955308 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955328 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4818cdd1-5a24-4562-8c2e-133ad936c184-config\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 
10:57:49.955349 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5dbc29a-2839-4feb-b37b-e929809f1ca9-config\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955370 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46259\" (UniqueName: \"kubernetes.io/projected/d676da73-15a0-474f-8928-ba09d685c500-kube-api-access-46259\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bab576b-7852-4a66-a119-57f3f3d7c6e5-metrics-tls\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955427 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-plugins-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6gvm\" (UniqueName: \"kubernetes.io/projected/955b6ef4-bf5f-4ac4-aece-03c12040abe4-kube-api-access-s6gvm\") pod \"package-server-manager-789f6589d5-xn57b\" (UID: \"955b6ef4-bf5f-4ac4-aece-03c12040abe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955475 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6c9709a-6e83-4ccd-9647-82174b1840d3-srv-cert\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f597q\" (UniqueName: \"kubernetes.io/projected/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-kube-api-access-f597q\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f9bda2-9253-415f-83f1-794c4ba878ce-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90718d42-2c78-46d1-93a1-dfe0adae10f0-srv-cert\") pod 
\"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955564 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f3fb635-33e3-4a97-bcac-42f01899aeb9-tmpfs\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955585 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d676da73-15a0-474f-8928-ba09d685c500-certs\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955823 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/380763b9-fdb6-4b62-a8e0-c775708be101-config-volume\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5dbc29a-2839-4feb-b37b-e929809f1ca9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.955981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-signing-key\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956012 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nf9d\" (UniqueName: \"kubernetes.io/projected/90718d42-2c78-46d1-93a1-dfe0adae10f0-kube-api-access-9nf9d\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956069 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9jf5\" (UniqueName: \"kubernetes.io/projected/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-kube-api-access-z9jf5\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956100 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-serving-cert\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956154 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-config\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-csi-data-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghm2c\" (UniqueName: \"kubernetes.io/projected/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-kube-api-access-ghm2c\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956317 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-client\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956341 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-registration-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xdpf\" (UniqueName: \"kubernetes.io/projected/91a50967-088f-4901-859e-1d1cb63549e6-kube-api-access-9xdpf\") pod \"dns-operator-744455d44c-ttsbw\" (UID: \"91a50967-088f-4901-859e-1d1cb63549e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956429 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d676da73-15a0-474f-8928-ba09d685c500-node-bootstrap-token\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956488 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-service-ca\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956510 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqbhw\" (UniqueName: \"kubernetes.io/projected/5f3fb635-33e3-4a97-bcac-42f01899aeb9-kube-api-access-hqbhw\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kflk9\" (UniqueName: \"kubernetes.io/projected/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-kube-api-access-kflk9\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-proxy-tls\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6c9709a-6e83-4ccd-9647-82174b1840d3-profile-collector-cert\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bab576b-7852-4a66-a119-57f3f3d7c6e5-config-volume\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957215 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957261 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f3fb635-33e3-4a97-bcac-42f01899aeb9-apiservice-cert\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957285 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwnld\" (UniqueName: \"kubernetes.io/projected/b4e59873-8cf1-4151-8ada-8b6ff4df2722-kube-api-access-rwnld\") pod \"ingress-canary-2vmv5\" (UID: \"b4e59873-8cf1-4151-8ada-8b6ff4df2722\") " pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957311 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6bls\" (UniqueName: \"kubernetes.io/projected/5211ce1a-3935-4397-8ad3-34fe963cf626-kube-api-access-d6bls\") pod \"multus-admission-controller-857f4d67dd-l2rvj\" (UID: \"5211ce1a-3935-4397-8ad3-34fe963cf626\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957342 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957359 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957362 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tff7\" (UniqueName: \"kubernetes.io/projected/91f9bda2-9253-415f-83f1-794c4ba878ce-kube-api-access-5tff7\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957439 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5211ce1a-3935-4397-8ad3-34fe963cf626-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-l2rvj\" (UID: \"5211ce1a-3935-4397-8ad3-34fe963cf626\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957471 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f3fb635-33e3-4a97-bcac-42f01899aeb9-webhook-cert\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwm2l\" (UniqueName: \"kubernetes.io/projected/4818cdd1-5a24-4562-8c2e-133ad936c184-kube-api-access-wwm2l\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.957661 4835 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ffrj5"] Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.959484 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90718d42-2c78-46d1-93a1-dfe0adae10f0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.961837 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-ca\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.962716 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fcad80cf-e547-402d-9259-a811a0c542c5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.962808 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-images\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.962959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91a50967-088f-4901-859e-1d1cb63549e6-metrics-tls\") pod \"dns-operator-744455d44c-ttsbw\" (UID: \"91a50967-088f-4901-859e-1d1cb63549e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.963491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-socket-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.963577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-serving-cert\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.964500 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5dbc29a-2839-4feb-b37b-e929809f1ca9-config\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.964721 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/380763b9-fdb6-4b62-a8e0-c775708be101-config-volume\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.965869 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-registration-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: E1002 10:57:49.966148 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.466129429 +0000 UTC m=+147.026037010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.966277 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-csi-data-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.956177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-mountpoint-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.968177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4818cdd1-5a24-4562-8c2e-133ad936c184-config\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.969185 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-client\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.969292 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-config\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.969564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.969640 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7c57d059-e457-4322-9580-32bdf8993a83-plugins-dir\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.969887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e59873-8cf1-4151-8ada-8b6ff4df2722-cert\") pod \"ingress-canary-2vmv5\" (UID: \"b4e59873-8cf1-4151-8ada-8b6ff4df2722\") " pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.969937 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcad80cf-e547-402d-9259-a811a0c542c5-proxy-tls\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.970822 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f9bda2-9253-415f-83f1-794c4ba878ce-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.972510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bab576b-7852-4a66-a119-57f3f3d7c6e5-config-volume\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.971519 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.972420 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.970891 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f3fb635-33e3-4a97-bcac-42f01899aeb9-tmpfs\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 
10:57:49.972881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-proxy-tls\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.973044 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.973350 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-signing-cabundle\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.973709 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9bda2-9253-415f-83f1-794c4ba878ce-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.975802 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bab576b-7852-4a66-a119-57f3f3d7c6e5-metrics-tls\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.975883 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4818cdd1-5a24-4562-8c2e-133ad936c184-serving-cert\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.976438 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6c9709a-6e83-4ccd-9647-82174b1840d3-srv-cert\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.977250 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5dbc29a-2839-4feb-b37b-e929809f1ca9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.977667 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f3fb635-33e3-4a97-bcac-42f01899aeb9-webhook-cert\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: 
\"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.978390 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r567v\" (UniqueName: \"kubernetes.io/projected/3da125d5-132a-44df-ba26-fd6305dabcdc-kube-api-access-r567v\") pod \"route-controller-manager-6576b87f9c-shflp\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.981201 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/955b6ef4-bf5f-4ac4-aece-03c12040abe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xn57b\" (UID: \"955b6ef4-bf5f-4ac4-aece-03c12040abe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.983421 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6c9709a-6e83-4ccd-9647-82174b1840d3-profile-collector-cert\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.984910 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d676da73-15a0-474f-8928-ba09d685c500-certs\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.985485 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d676da73-15a0-474f-8928-ba09d685c500-node-bootstrap-token\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.985789 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/380763b9-fdb6-4b62-a8e0-c775708be101-secret-volume\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.986649 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-etcd-service-ca\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.987333 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f3fb635-33e3-4a97-bcac-42f01899aeb9-apiservice-cert\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.990742 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpp67\" (UniqueName: \"kubernetes.io/projected/bcd814c3-6b27-4cd7-a315-0dec8015d04f-kube-api-access-xpp67\") pod \"control-plane-machine-set-operator-78cbb6b69f-44wvb\" (UID: \"bcd814c3-6b27-4cd7-a315-0dec8015d04f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.992563 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-signing-key\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:49 crc kubenswrapper[4835]: I1002 10:57:49.998950 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90718d42-2c78-46d1-93a1-dfe0adae10f0-srv-cert\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.000680 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zq5\" (UniqueName: \"kubernetes.io/projected/27dd8636-3710-41e0-ad06-680140702c28-kube-api-access-b4zq5\") pod \"apiserver-76f77b778f-99q5b\" (UID: \"27dd8636-3710-41e0-ad06-680140702c28\") " pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.009920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5211ce1a-3935-4397-8ad3-34fe963cf626-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-l2rvj\" (UID: \"5211ce1a-3935-4397-8ad3-34fe963cf626\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.023170 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm77p\" (UniqueName: \"kubernetes.io/projected/495e3390-c592-4669-b313-ca2d397746f7-kube-api-access-qm77p\") pod \"downloads-7954f5f757-6xvln\" (UID: \"495e3390-c592-4669-b313-ca2d397746f7\") " pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.031136 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.048661 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgq4q\" (UniqueName: \"kubernetes.io/projected/ea6c1cb7-07e8-4a81-9240-5ade44f372cf-kube-api-access-wgq4q\") pod \"router-default-5444994796-vpsrn\" (UID: \"ea6c1cb7-07e8-4a81-9240-5ade44f372cf\") " pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.058571 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.058811 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.558762833 +0000 UTC m=+147.118670414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.059588 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.059717 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.060114 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.560103043 +0000 UTC m=+147.120010614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.067143 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.069610 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.080699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-bound-sa-token\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.096295 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.101150 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2jkw\" (UniqueName: \"kubernetes.io/projected/fcad80cf-e547-402d-9259-a811a0c542c5-kube-api-access-s2jkw\") pod \"machine-config-controller-84d6567774-gjzmw\" (UID: \"fcad80cf-e547-402d-9259-a811a0c542c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.101261 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.102180 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.123970 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.125140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqt7\" (UniqueName: \"kubernetes.io/projected/3bab576b-7852-4a66-a119-57f3f3d7c6e5-kube-api-access-fwqt7\") pod \"dns-default-p8g6s\" (UID: \"3bab576b-7852-4a66-a119-57f3f3d7c6e5\") " pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.130787 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" event={"ID":"8192291d-afb3-4169-a5e3-177080eacd7d","Type":"ContainerStarted","Data":"99b9e0cdb4ca59c825d0a83189a11c4f2cb6fd5cf0d1bc0bca488882feb9a5f6"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.133691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" event={"ID":"b3318f2b-f771-4c19-ac42-46af048219f9","Type":"ContainerStarted","Data":"2f4a796615087838c278b2f0950344e4912cdac159555349a04d6aec39e57cc8"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.158989 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" event={"ID":"73f4901e-a2c3-474d-8d52-972b775c2017","Type":"ContainerStarted","Data":"c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.159054 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" event={"ID":"73f4901e-a2c3-474d-8d52-972b775c2017","Type":"ContainerStarted","Data":"1be9255f09240e9ff3743362e9e389fe9d286273b6fba38be644ae5ba1eb5676"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.159892 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.160607 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.162359 4835 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zl6kd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.162418 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" podUID="73f4901e-a2c3-474d-8d52-972b775c2017" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.163040 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l89kj\" (UniqueName: \"kubernetes.io/projected/7c57d059-e457-4322-9580-32bdf8993a83-kube-api-access-l89kj\") pod \"csi-hostpathplugin-459jg\" (UID: \"7c57d059-e457-4322-9580-32bdf8993a83\") " 
pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.178129 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.678099049 +0000 UTC m=+147.238006630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.178488 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.180701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.181927 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.681907443 +0000 UTC m=+147.241815024 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.183033 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44t29\" (UniqueName: \"kubernetes.io/projected/380763b9-fdb6-4b62-a8e0-c775708be101-kube-api-access-44t29\") pod \"collect-profiles-29323365-x2ggk\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.190404 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5dbc29a-2839-4feb-b37b-e929809f1ca9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pqfj4\" (UID: \"c5dbc29a-2839-4feb-b37b-e929809f1ca9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.191344 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" event={"ID":"26659604-60d4-488b-a38b-52fedf8d098d","Type":"ContainerStarted","Data":"f30d766ebb36bd870407dfd93e6c7eeab8b39fea53255b4ffd402ad9e7d2c682"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.210298 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6gvm\" (UniqueName: \"kubernetes.io/projected/955b6ef4-bf5f-4ac4-aece-03c12040abe4-kube-api-access-s6gvm\") pod \"package-server-manager-789f6589d5-xn57b\" (UID: \"955b6ef4-bf5f-4ac4-aece-03c12040abe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.214373 4835 generic.go:334] "Generic (PLEG): container finished" podID="b41c8fbf-c78d-4b6c-8241-a4bbd2654291" containerID="5b6117c5e1352fb4d2c77123ff709ed0ea7d568990c29b937cb25109bb8891b4" exitCode=0 Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.215162 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" event={"ID":"b41c8fbf-c78d-4b6c-8241-a4bbd2654291","Type":"ContainerDied","Data":"5b6117c5e1352fb4d2c77123ff709ed0ea7d568990c29b937cb25109bb8891b4"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.215310 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:57:50 crc kubenswrapper[4835]: W1002 10:57:50.221441 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6c1cb7_07e8_4a81_9240_5ade44f372cf.slice/crio-6f7ac617474a6db92c8826710e2c5728f7e0ed5da759eccb8462363d4077c8e6 WatchSource:0}: Error finding container 6f7ac617474a6db92c8826710e2c5728f7e0ed5da759eccb8462363d4077c8e6: Status 404 returned error can't find the container with id 6f7ac617474a6db92c8826710e2c5728f7e0ed5da759eccb8462363d4077c8e6 Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.225962 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fhk2\" (UniqueName: \"kubernetes.io/projected/a6c9709a-6e83-4ccd-9647-82174b1840d3-kube-api-access-8fhk2\") pod \"catalog-operator-68c6474976-kg4tw\" (UID: \"a6c9709a-6e83-4ccd-9647-82174b1840d3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.235361 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" event={"ID":"a39aeae5-118c-4f69-8908-e35f2b1615b8","Type":"ContainerStarted","Data":"a26b7e2888ce734bb16709a544140a96db5d4132b16fa172b7ff31df21969a5d"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.239284 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" event={"ID":"35729d8f-7bae-44f7-be5c-054cd97fb39c","Type":"ContainerStarted","Data":"27ebe8af70827a321e5ef0dd430b59522dfc203d63287d28f3959927ea787048"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.241759 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.244366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" event={"ID":"073951ec-7b3a-4f78-89ee-771308246966","Type":"ContainerStarted","Data":"d6b1b2d5758cf2dabc209d214ebb2de56cde52584f761b3828db0ea8ed1d004d"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.244392 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" event={"ID":"073951ec-7b3a-4f78-89ee-771308246966","Type":"ContainerStarted","Data":"bd31fa3b212685b12e469c49ade3d7c2113adfec60ad0d8eed25486240b9324d"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.244403 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" event={"ID":"073951ec-7b3a-4f78-89ee-771308246966","Type":"ContainerStarted","Data":"bfb09a2173c63ec45e5e60c563fa0cd5cf9f9aab32fbadef531a5067b7148efa"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.247749 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" event={"ID":"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487","Type":"ContainerStarted","Data":"c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.247856 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46259\" (UniqueName: \"kubernetes.io/projected/d676da73-15a0-474f-8928-ba09d685c500-kube-api-access-46259\") pod \"machine-config-server-hvdzw\" (UID: \"d676da73-15a0-474f-8928-ba09d685c500\") " pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.248360 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.250725 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" event={"ID":"7cd86268-0ac9-4583-8285-dc07ea50cc28","Type":"ContainerStarted","Data":"57124c71f0d5586f6ce71c8ea7cafdc7a0abad233100bf03e17a8bf5237a4a82"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.258661 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-lz6t7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.258683 4835 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lm4nm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.258706 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" podUID="7cd86268-0ac9-4583-8285-dc07ea50cc28" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 02 
10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.258735 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" podUID="3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.260924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/750a63ef-89d8-4df6-8d8b-fdc9125efbe7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4w4pp\" (UID: \"750a63ef-89d8-4df6-8d8b-fdc9125efbe7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.262022 4835 generic.go:334] "Generic (PLEG): container finished" podID="a021eefd-fd0f-458d-8629-d40a02daa592" containerID="7b5d04682de290be498900fd5df1774de474758e4032f9997fba18c7b7e5a0fd" exitCode=0 Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.282051 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.282929 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.283087 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.783064533 +0000 UTC m=+147.342972114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.284195 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nf9d\" (UniqueName: \"kubernetes.io/projected/90718d42-2c78-46d1-93a1-dfe0adae10f0-kube-api-access-9nf9d\") pod \"olm-operator-6b444d44fb-7292x\" (UID: \"90718d42-2c78-46d1-93a1-dfe0adae10f0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.284778 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.285081 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.785068173 +0000 UTC m=+147.344975754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.302068 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9jf5\" (UniqueName: \"kubernetes.io/projected/7facfd69-debd-4f8e-9a69-2ca06fcac7cc-kube-api-access-z9jf5\") pod \"machine-config-operator-74547568cd-tvq9m\" (UID: \"7facfd69-debd-4f8e-9a69-2ca06fcac7cc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.324646 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" event={"ID":"e81c3107-368c-40ff-b16f-4a97cffa7d13","Type":"ContainerStarted","Data":"4a417dca5018ef23fa94e41bbd14953ecde82cfe69cd13adacc4c037a71cf8ee"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.324698 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.324712 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rzxmz" event={"ID":"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0","Type":"ContainerStarted","Data":"11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.324725 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" event={"ID":"32bb0a90-5866-4c3b-b96d-62eb9d4e04ca","Type":"ContainerStarted","Data":"6bf331177e82c27b8dda95ae70af55d728a04b04dab55a811fbf14f39eea31bf"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.324739 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" event={"ID":"a021eefd-fd0f-458d-8629-d40a02daa592","Type":"ContainerDied","Data":"7b5d04682de290be498900fd5df1774de474758e4032f9997fba18c7b7e5a0fd"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.324755 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" event={"ID":"a021eefd-fd0f-458d-8629-d40a02daa592","Type":"ContainerStarted","Data":"6a17e7d7eaf210129f3721d5ea3e66f99cadf5356bd753fbe8fe611b1152cb69"} Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.329458 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.332507 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghm2c\" (UniqueName: \"kubernetes.io/projected/d41cb54d-2237-42c8-81a4-9cefabbf0cd9-kube-api-access-ghm2c\") pod \"etcd-operator-b45778765-fbjn7\" (UID: \"d41cb54d-2237-42c8-81a4-9cefabbf0cd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.341524 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hvdzw" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.346098 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tff7\" (UniqueName: \"kubernetes.io/projected/91f9bda2-9253-415f-83f1-794c4ba878ce-kube-api-access-5tff7\") pod \"kube-storage-version-migrator-operator-b67b599dd-ssr48\" (UID: \"91f9bda2-9253-415f-83f1-794c4ba878ce\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.355030 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-459jg" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.373940 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f597q\" (UniqueName: \"kubernetes.io/projected/afe0e854-1ff8-4bad-a5dc-3fc945d7a40f-kube-api-access-f597q\") pod \"service-ca-9c57cc56f-lbswp\" (UID: \"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.383995 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6bls\" (UniqueName: \"kubernetes.io/projected/5211ce1a-3935-4397-8ad3-34fe963cf626-kube-api-access-d6bls\") pod \"multus-admission-controller-857f4d67dd-l2rvj\" (UID: \"5211ce1a-3935-4397-8ad3-34fe963cf626\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.385607 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.385882 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.885864013 +0000 UTC m=+147.445771594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.402972 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqbhw\" (UniqueName: \"kubernetes.io/projected/5f3fb635-33e3-4a97-bcac-42f01899aeb9-kube-api-access-hqbhw\") pod \"packageserver-d55dfcdfc-nzqtr\" (UID: \"5f3fb635-33e3-4a97-bcac-42f01899aeb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.418764 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.423967 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kflk9\" (UniqueName: \"kubernetes.io/projected/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-kube-api-access-kflk9\") pod \"marketplace-operator-79b997595-grwx8\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.432684 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.439395 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.446092 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xdpf\" (UniqueName: \"kubernetes.io/projected/91a50967-088f-4901-859e-1d1cb63549e6-kube-api-access-9xdpf\") pod \"dns-operator-744455d44c-ttsbw\" (UID: \"91a50967-088f-4901-859e-1d1cb63549e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.453524 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.457810 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:50 crc kubenswrapper[4835]: W1002 10:57:50.465188 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcd814c3_6b27_4cd7_a315_0dec8015d04f.slice/crio-e98a77c03b256f2d32f182700a397cb1ee56df98e947340f483ffa713edd478e WatchSource:0}: Error finding container e98a77c03b256f2d32f182700a397cb1ee56df98e947340f483ffa713edd478e: Status 404 returned error can't find the container with id e98a77c03b256f2d32f182700a397cb1ee56df98e947340f483ffa713edd478e Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.466474 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwm2l\" (UniqueName: \"kubernetes.io/projected/4818cdd1-5a24-4562-8c2e-133ad936c184-kube-api-access-wwm2l\") pod \"service-ca-operator-777779d784-4nmsf\" (UID: \"4818cdd1-5a24-4562-8c2e-133ad936c184\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.468491 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.474657 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.485599 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.486277 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.486610 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:50.986597431 +0000 UTC m=+147.546505012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.491929 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.494070 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwnld\" (UniqueName: \"kubernetes.io/projected/b4e59873-8cf1-4151-8ada-8b6ff4df2722-kube-api-access-rwnld\") pod \"ingress-canary-2vmv5\" (UID: \"b4e59873-8cf1-4151-8ada-8b6ff4df2722\") " pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.504780 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.515512 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.520772 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.522104 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.529806 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.560805 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6xvln"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.579537 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.594848 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.595336 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.095311068 +0000 UTC m=+147.655218649 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.597048 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.600610 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-99q5b"] Oct 02 10:57:50 crc kubenswrapper[4835]: W1002 10:57:50.609621 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da125d5_132a_44df_ba26_fd6305dabcdc.slice/crio-66125fd7bb1b943f57a2bad98d19dfad2552cd2c4726a6ca062b00914b88bde5 WatchSource:0}: Error finding container 66125fd7bb1b943f57a2bad98d19dfad2552cd2c4726a6ca062b00914b88bde5: Status 404 returned error can't find the container with id 66125fd7bb1b943f57a2bad98d19dfad2552cd2c4726a6ca062b00914b88bde5 Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.650111 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2vmv5" Oct 02 10:57:50 crc kubenswrapper[4835]: W1002 10:57:50.675427 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495e3390_c592_4669_b313_ca2d397746f7.slice/crio-502860e4458dcea148cdc54593440eb021c2eab0d1e78e27bef034f9df0de9a1 WatchSource:0}: Error finding container 502860e4458dcea148cdc54593440eb021c2eab0d1e78e27bef034f9df0de9a1: Status 404 returned error can't find the container with id 502860e4458dcea148cdc54593440eb021c2eab0d1e78e27bef034f9df0de9a1 Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.681885 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.693293 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.699297 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.699788 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.199759247 +0000 UTC m=+147.759666828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: W1002 10:57:50.731648 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27dd8636_3710_41e0_ad06_680140702c28.slice/crio-b4ea5bad4bc671b68441e1e6582dc4622ad2494dcda9ffcdeb1337a37faaaf95 WatchSource:0}: Error finding container b4ea5bad4bc671b68441e1e6582dc4622ad2494dcda9ffcdeb1337a37faaaf95: Status 404 returned error can't find the container with id b4ea5bad4bc671b68441e1e6582dc4622ad2494dcda9ffcdeb1337a37faaaf95 Oct 02 10:57:50 crc kubenswrapper[4835]: W1002 10:57:50.764382 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955b6ef4_bf5f_4ac4_aece_03c12040abe4.slice/crio-3a4b8f0e9714c2ae11062f137d426fffdf281716f65e7f2ff89dd8a17cbe3d33 WatchSource:0}: Error finding container 3a4b8f0e9714c2ae11062f137d426fffdf281716f65e7f2ff89dd8a17cbe3d33: Status 404 returned error can't find the container with id 3a4b8f0e9714c2ae11062f137d426fffdf281716f65e7f2ff89dd8a17cbe3d33 Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.780445 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p8g6s"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.803721 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.804390 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.304363051 +0000 UTC m=+147.864270632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.815632 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.875938 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk"] Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.906180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:50 crc kubenswrapper[4835]: E1002 10:57:50.906652 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.406629595 +0000 UTC m=+147.966537176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:50 crc kubenswrapper[4835]: I1002 10:57:50.980906 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp"] Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.007857 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.008261 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.508235529 +0000 UTC m=+148.068143110 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.019062 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" podStartSLOduration=126.019031532 podStartE2EDuration="2m6.019031532s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:50.99127119 +0000 UTC m=+147.551178791" watchObservedRunningTime="2025-10-02 10:57:51.019031532 +0000 UTC m=+147.578939113" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.109062 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.109924 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.609905304 +0000 UTC m=+148.169812885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.135501 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tx27b" podStartSLOduration=126.135475491 podStartE2EDuration="2m6.135475491s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:51.133907774 +0000 UTC m=+147.693815355" watchObservedRunningTime="2025-10-02 10:57:51.135475491 +0000 UTC m=+147.695383072" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.211083 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.211768 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.711744725 +0000 UTC m=+148.271652306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.264099 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" podStartSLOduration=126.264075183 podStartE2EDuration="2m6.264075183s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:51.218034464 +0000 UTC m=+147.777942045" watchObservedRunningTime="2025-10-02 10:57:51.264075183 +0000 UTC m=+147.823982764" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.265800 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" podStartSLOduration=126.265793195 podStartE2EDuration="2m6.265793195s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:51.262182107 +0000 UTC m=+147.822089688" watchObservedRunningTime="2025-10-02 10:57:51.265793195 +0000 UTC m=+147.825700776" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.312751 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.313915 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.813887085 +0000 UTC m=+148.373794846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.327827 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" event={"ID":"a021eefd-fd0f-458d-8629-d40a02daa592","Type":"ContainerStarted","Data":"10cb09a434a3da8c57afeb4447551dc238dd7ea07873931541619a29b2c60425"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.332356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" event={"ID":"750a63ef-89d8-4df6-8d8b-fdc9125efbe7","Type":"ContainerStarted","Data":"40119da8e0ecd6522d3d26b7b65875aab3855684a877271ba684ac4069c9ac68"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.353690 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dvpwv" podStartSLOduration=126.353666497 podStartE2EDuration="2m6.353666497s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:51.353515173 +0000 UTC m=+147.913422754" watchObservedRunningTime="2025-10-02 10:57:51.353666497 +0000 UTC m=+147.913574078" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.369844 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" event={"ID":"380763b9-fdb6-4b62-a8e0-c775708be101","Type":"ContainerStarted","Data":"611019d2cc35e40e4fb270f9a5240aff221d1a7a6f93bdb5e8960efe4152ae6e"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.383540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" event={"ID":"fcad80cf-e547-402d-9259-a811a0c542c5","Type":"ContainerStarted","Data":"ff88133a2cf41a278f876f8288fc5f1dc80ac4f0b218cd96c777433b10a3ef57"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.391165 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" event={"ID":"507a9c92-aeba-4593-8280-6e6244c34b01","Type":"ContainerStarted","Data":"1c0766551fb38593f4da0df86644a5e821c25166cc0c0da777e2a79221fd76f6"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.401129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p8g6s" event={"ID":"3bab576b-7852-4a66-a119-57f3f3d7c6e5","Type":"ContainerStarted","Data":"adba66cf864a23df71281ae356553411be5af617a59c407c67a6f64042cfef07"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.406438 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" event={"ID":"8192291d-afb3-4169-a5e3-177080eacd7d","Type":"ContainerStarted","Data":"6717790835c76bf4c31053829250fe20d558fe03c4f0ec5076602f991a99f64a"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.408121 4835 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf"] Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.408829 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" event={"ID":"a39aeae5-118c-4f69-8908-e35f2b1615b8","Type":"ContainerStarted","Data":"93d441240a9c9c73d53aaa82c4eb67430cd85331572737af9a939bf04d41131b"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.410660 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" event={"ID":"35729d8f-7bae-44f7-be5c-054cd97fb39c","Type":"ContainerStarted","Data":"a50567a9ccc7c7e4911687b23d830c687e3da6885f7dd242579c9839dca2a80f"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.411798 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" event={"ID":"27dd8636-3710-41e0-ad06-680140702c28","Type":"ContainerStarted","Data":"b4ea5bad4bc671b68441e1e6582dc4622ad2494dcda9ffcdeb1337a37faaaf95"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.424883 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.426771 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" event={"ID":"e81c3107-368c-40ff-b16f-4a97cffa7d13","Type":"ContainerStarted","Data":"c36a975d4a9767f080a6b04daab09d3af5f3c3a30ed52340273177411c725f23"} Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.428166 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:51.928129118 +0000 UTC m=+148.488036739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.429149 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" event={"ID":"bcd814c3-6b27-4cd7-a315-0dec8015d04f","Type":"ContainerStarted","Data":"e98a77c03b256f2d32f182700a397cb1ee56df98e947340f483ffa713edd478e"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.434066 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" event={"ID":"b41c8fbf-c78d-4b6c-8241-a4bbd2654291","Type":"ContainerStarted","Data":"5c1fa46c8d52961e637aef22233fe95582e47754f4cb43c19193202c80980902"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.434440 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.437811 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" event={"ID":"121f1415-149d-42d8-a0bd-28aa16e3a55a","Type":"ContainerStarted","Data":"ad0060e704c7b63a2d9bbe6dbb75235ef62ce9c419985bba8e5d3f3b45fb9277"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.441038 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vpsrn" event={"ID":"ea6c1cb7-07e8-4a81-9240-5ade44f372cf","Type":"ContainerStarted","Data":"0413820ff47c0ee57108f0d03b569bec283c2b30dfb4c06adad744a656bc5738"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.441069 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vpsrn" event={"ID":"ea6c1cb7-07e8-4a81-9240-5ade44f372cf","Type":"ContainerStarted","Data":"6f7ac617474a6db92c8826710e2c5728f7e0ed5da759eccb8462363d4077c8e6"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.443670 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" event={"ID":"3da125d5-132a-44df-ba26-fd6305dabcdc","Type":"ContainerStarted","Data":"66125fd7bb1b943f57a2bad98d19dfad2552cd2c4726a6ca062b00914b88bde5"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.447235 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" event={"ID":"955b6ef4-bf5f-4ac4-aece-03c12040abe4","Type":"ContainerStarted","Data":"3a4b8f0e9714c2ae11062f137d426fffdf281716f65e7f2ff89dd8a17cbe3d33"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.450487 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hvdzw" event={"ID":"d676da73-15a0-474f-8928-ba09d685c500","Type":"ContainerStarted","Data":"c67d6270f2a0bb534b3cc254cf95906219cfa648a837f48d477ba3868548fe82"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.453342 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-6xvln" event={"ID":"495e3390-c592-4669-b313-ca2d397746f7","Type":"ContainerStarted","Data":"502860e4458dcea148cdc54593440eb021c2eab0d1e78e27bef034f9df0de9a1"} Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.456302 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-lz6t7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.456346 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" podUID="7cd86268-0ac9-4583-8285-dc07ea50cc28" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.459002 4835 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zl6kd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.459062 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" podUID="73f4901e-a2c3-474d-8d52-972b775c2017" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.533104 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.537059 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.036760803 +0000 UTC m=+148.596668384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.587050 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-zshpt" podStartSLOduration=126.587020938 podStartE2EDuration="2m6.587020938s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:51.499421634 +0000 UTC m=+148.059329215" watchObservedRunningTime="2025-10-02 10:57:51.587020938 +0000 UTC m=+148.146928519" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.657339 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.660714 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.160694885 +0000 UTC m=+148.720602466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.694754 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tw5z2" podStartSLOduration=126.694728275 podStartE2EDuration="2m6.694728275s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:51.692621982 +0000 UTC m=+148.252529553" watchObservedRunningTime="2025-10-02 10:57:51.694728275 +0000 UTC m=+148.254635866" Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.761726 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.762076 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.262061632 +0000 UTC m=+148.821969213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.862743 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.863349 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.363329586 +0000 UTC m=+148.923237167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.968251 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:51 crc kubenswrapper[4835]: E1002 10:57:51.968701 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.468684572 +0000 UTC m=+149.028592163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:51 crc kubenswrapper[4835]: I1002 10:57:51.973034 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rzxmz" podStartSLOduration=126.972994632 podStartE2EDuration="2m6.972994632s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:51.947336263 +0000 UTC m=+148.507243864" watchObservedRunningTime="2025-10-02 10:57:51.972994632 +0000 UTC m=+148.532902213" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.048310 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.097138 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.097755 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.097802 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.099095 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.599072379 +0000 UTC m=+149.158979960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.102852 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lbswp"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.111194 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-459jg"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.112014 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.122278 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.127455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.127542 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.129523 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.129568 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.140409 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.151796 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.200640 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.201015 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.700992532 +0000 UTC m=+149.260900113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.201614 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.201657 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.221894 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.236978 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.261119 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ttsbw"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.284560 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.288500 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.308391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.310328 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:52.810300847 +0000 UTC m=+149.370208428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.310430 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:57:52 crc kubenswrapper[4835]: W1002 10:57:52.316776 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f3fb635_33e3_4a97_bcac_42f01899aeb9.slice/crio-0c805745c96234fa7bbf4d4d1cf4feb08c9e926cd4211863c77b8d0c5399ec40 WatchSource:0}: Error finding container 0c805745c96234fa7bbf4d4d1cf4feb08c9e926cd4211863c77b8d0c5399ec40: Status 404 returned error can't find the container with id 0c805745c96234fa7bbf4d4d1cf4feb08c9e926cd4211863c77b8d0c5399ec40 Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.335213 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-grwx8"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.335263 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.384739 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-l2rvj"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.409902 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.411666 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-02 10:57:52.911653013 +0000 UTC m=+149.471560594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.434065 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fbjn7"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.437671 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ffrj5" podStartSLOduration=127.437641392 podStartE2EDuration="2m7.437641392s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.437105036 +0000 UTC m=+148.997012617" watchObservedRunningTime="2025-10-02 10:57:52.437641392 +0000 UTC m=+148.997548973" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.457846 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.471202 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" podStartSLOduration=127.471162166 podStartE2EDuration="2m7.471162166s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.46560949 +0000 UTC m=+149.025517081" watchObservedRunningTime="2025-10-02 10:57:52.471162166 +0000 UTC m=+149.031069747" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.500507 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" podStartSLOduration=127.500488935 podStartE2EDuration="2m7.500488935s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.49999217 +0000 UTC m=+149.059899751" watchObservedRunningTime="2025-10-02 10:57:52.500488935 +0000 UTC m=+149.060396516" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.511989 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.512678 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.012659269 +0000 UTC m=+149.572566850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.517165 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2vmv5"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.519873 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m"] Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.533335 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vpsrn" podStartSLOduration=127.533317438 podStartE2EDuration="2m7.533317438s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.533103942 +0000 UTC m=+149.093011523" watchObservedRunningTime="2025-10-02 10:57:52.533317438 +0000 UTC m=+149.093225019" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.544143 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hvdzw" event={"ID":"d676da73-15a0-474f-8928-ba09d685c500","Type":"ContainerStarted","Data":"1261b6d6aaa455f0b01fd4172bf88be9dc1b8a55d21edcf087006dc8173b9498"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.549329 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" event={"ID":"c5dbc29a-2839-4feb-b37b-e929809f1ca9","Type":"ContainerStarted","Data":"e817225c2d9dd899e79caa602fb07332e07685633d6a9bbc2c77807aca83ee48"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.560870 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" event={"ID":"121f1415-149d-42d8-a0bd-28aa16e3a55a","Type":"ContainerStarted","Data":"a52c3bbbef3c7ab065eb92d212fb7fab8fa7def67ca16ddb448c1dccf202660c"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.567901 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hvdzw" podStartSLOduration=5.567883144 podStartE2EDuration="5.567883144s" podCreationTimestamp="2025-10-02 10:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.561999347 +0000 UTC m=+149.121906928" watchObservedRunningTime="2025-10-02 10:57:52.567883144 +0000 UTC m=+149.127790725" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.582430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" event={"ID":"8192291d-afb3-4169-a5e3-177080eacd7d","Type":"ContainerStarted","Data":"242b5a09aab57f198b996e0e502efd9cbbdda18048b26a1795ad945e49ed4e1b"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.587042 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4lg7t" podStartSLOduration=127.587023517 podStartE2EDuration="2m7.587023517s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.586139931 +0000 UTC m=+149.146047512" watchObservedRunningTime="2025-10-02 10:57:52.587023517 +0000 UTC m=+149.146931098" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.600524 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" event={"ID":"380763b9-fdb6-4b62-a8e0-c775708be101","Type":"ContainerStarted","Data":"c4fbd1bc8691c74e725304953e117bf2a8b494624f56496be446332544b4ffe7"} Oct 02 10:57:52 crc kubenswrapper[4835]: W1002 10:57:52.612455 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7facfd69_debd_4f8e_9a69_2ca06fcac7cc.slice/crio-4e7ca13c4fbac8228124cc9cc80a3d7c5067e46dc407b77d09edf081eb4fe3b2 WatchSource:0}: Error finding container 4e7ca13c4fbac8228124cc9cc80a3d7c5067e46dc407b77d09edf081eb4fe3b2: Status 404 returned error can't find the container with id 4e7ca13c4fbac8228124cc9cc80a3d7c5067e46dc407b77d09edf081eb4fe3b2 Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.613546 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.616166 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.1161526 +0000 UTC m=+149.676060181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.665381 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" event={"ID":"a39aeae5-118c-4f69-8908-e35f2b1615b8","Type":"ContainerStarted","Data":"1fc187ef9f485ee877c7578e11a7248cd163a0151862a766fe1fa49241a4bf48"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.678117 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" event={"ID":"bcd814c3-6b27-4cd7-a315-0dec8015d04f","Type":"ContainerStarted","Data":"61bbf57b2c18bee1111980060c7d2784bf76d9ad7e159dae66be75ecd803280f"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.690356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" event={"ID":"750a63ef-89d8-4df6-8d8b-fdc9125efbe7","Type":"ContainerStarted","Data":"b7dca3634701e798a57db19529f8ba134f8098127a3d32c6e50fcaf6bf207523"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.695501 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6qmsd" podStartSLOduration=128.695487077 podStartE2EDuration="2m8.695487077s" podCreationTimestamp="2025-10-02 10:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.693687293 +0000 UTC m=+149.253594864" watchObservedRunningTime="2025-10-02 10:57:52.695487077 +0000 UTC m=+149.255394658" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.696175 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6vp7k" podStartSLOduration=127.696167427 podStartE2EDuration="2m7.696167427s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.638364755 +0000 UTC m=+149.198272336" watchObservedRunningTime="2025-10-02 10:57:52.696167427 +0000 UTC m=+149.256075008" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.714841 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.716737 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.216720313 +0000 UTC m=+149.776627894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.725262 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4w4pp" podStartSLOduration=127.725241268 podStartE2EDuration="2m7.725241268s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.723885997 +0000 UTC m=+149.283793598" watchObservedRunningTime="2025-10-02 10:57:52.725241268 +0000 UTC m=+149.285148849" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.777178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" event={"ID":"e81c3107-368c-40ff-b16f-4a97cffa7d13","Type":"ContainerStarted","Data":"c79c2db16aeb3092e53dcc6d7c318e81009b6d4477e07cfe9693c5d6f06e6676"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.777271 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44wvb" podStartSLOduration=127.777215835 podStartE2EDuration="2m7.777215835s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.7747102 +0000 UTC m=+149.334617781" watchObservedRunningTime="2025-10-02 10:57:52.777215835 +0000 UTC m=+149.337123416" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.826622 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.828935 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.328918904 +0000 UTC m=+149.888826485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.856968 4835 generic.go:334] "Generic (PLEG): container finished" podID="27dd8636-3710-41e0-ad06-680140702c28" containerID="8ab0889234e5a55a1d629090454655117ced53d59b61b6bf27bbd7a4b80500e7" exitCode=0 Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.857132 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" event={"ID":"27dd8636-3710-41e0-ad06-680140702c28","Type":"ContainerDied","Data":"8ab0889234e5a55a1d629090454655117ced53d59b61b6bf27bbd7a4b80500e7"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.860758 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z44g2" podStartSLOduration=127.860744818 podStartE2EDuration="2m7.860744818s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:52.857635214 +0000 UTC m=+149.417542795" watchObservedRunningTime="2025-10-02 10:57:52.860744818 +0000 UTC m=+149.420652399" Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.921478 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" event={"ID":"5f3fb635-33e3-4a97-bcac-42f01899aeb9","Type":"ContainerStarted","Data":"0c805745c96234fa7bbf4d4d1cf4feb08c9e926cd4211863c77b8d0c5399ec40"} Oct 02 10:57:52 crc kubenswrapper[4835]: I1002 10:57:52.931580 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:52 crc kubenswrapper[4835]: E1002 10:57:52.932859 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.432837527 +0000 UTC m=+149.992745108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.034509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.035079 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.535044579 +0000 UTC m=+150.094952160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.128304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" event={"ID":"507a9c92-aeba-4593-8280-6e6244c34b01","Type":"ContainerStarted","Data":"e23941bdaecab0cce4f6607af3a13f36a97e06a43b9038efd937025c0b437a5e"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.131274 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:57:53 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:57:53 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:57:53 crc kubenswrapper[4835]: healthz check failed Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.131322 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.137251 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.137711 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.637694245 +0000 UTC m=+150.197601826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.149475 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" event={"ID":"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f","Type":"ContainerStarted","Data":"d5708af2ba1a88bb9777ea4c84c7e5a3836a9dffad545e78744a309df8eafcc1"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.155702 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" event={"ID":"c1ab4193-73f3-4a23-a134-ca28f61c7eb0","Type":"ContainerStarted","Data":"673488e99a732a0ec10e3982c5c0c978beb9d8587e354acfefe10c0e67c4f94e"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.170351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p8g6s" event={"ID":"3bab576b-7852-4a66-a119-57f3f3d7c6e5","Type":"ContainerStarted","Data":"c6404221e2f0a349ff00f82ca72d99967f77c2165f66c5f1ffba1be82751bbbf"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.176877 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ghgvd" podStartSLOduration=128.176860148 podStartE2EDuration="2m8.176860148s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.173296351 +0000 UTC m=+149.733203952" watchObservedRunningTime="2025-10-02 10:57:53.176860148 +0000 UTC m=+149.736767729" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.241515 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.241946 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.741924517 +0000 UTC m=+150.301832098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.266893 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" event={"ID":"4818cdd1-5a24-4562-8c2e-133ad936c184","Type":"ContainerStarted","Data":"28ac818182f6cd654f01f25017f2b28e01f2af7eaa14e8f38be9e5b48002d79e"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.277768 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" event={"ID":"4818cdd1-5a24-4562-8c2e-133ad936c184","Type":"ContainerStarted","Data":"690d3170f063c63e7a416371ec3caea607150e3edf7994c7ccbb0d75f7569497"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.305760 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4nmsf" podStartSLOduration=128.305734019 podStartE2EDuration="2m8.305734019s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.305546353 +0000 UTC m=+149.865453934" watchObservedRunningTime="2025-10-02 10:57:53.305734019 +0000 UTC m=+149.865641590" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.345310 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.346743 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.846713567 +0000 UTC m=+150.406621148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.397025 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" event={"ID":"fcad80cf-e547-402d-9259-a811a0c542c5","Type":"ContainerStarted","Data":"c1378df1134f355182fcbfdfe116a743772cb6d0daed6753cd90a1b8c3ab2a2f"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.397089 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" event={"ID":"fcad80cf-e547-402d-9259-a811a0c542c5","Type":"ContainerStarted","Data":"f45addf822374db4ff49419029fffbedadc78ecca784146adeb6e6b4bfe0f7e1"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.444389 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-459jg" event={"ID":"7c57d059-e457-4322-9580-32bdf8993a83","Type":"ContainerStarted","Data":"164983c5fffdecf9b8e26e5ec37ae43f723b443cda1e62fe706bec4da581b7be"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.446472 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.447954 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:53.947933009 +0000 UTC m=+150.507840590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.457519 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gjzmw" podStartSLOduration=128.457493565 podStartE2EDuration="2m8.457493565s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.44564129 +0000 UTC m=+150.005548871" watchObservedRunningTime="2025-10-02 10:57:53.457493565 +0000 UTC m=+150.017401146" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.490503 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" event={"ID":"955b6ef4-bf5f-4ac4-aece-03c12040abe4","Type":"ContainerStarted","Data":"2e1278e3d0e893720d51e07795b85bdc13aae1cb6f51cc1cf5cd160412b379c0"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.490552 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" event={"ID":"955b6ef4-bf5f-4ac4-aece-03c12040abe4","Type":"ContainerStarted","Data":"4cfb064c3657f57423e997a2fd4d64abfe248784e0e86a5ea6254b6656679235"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.491554 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.507765 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6xvln" event={"ID":"495e3390-c592-4669-b313-ca2d397746f7","Type":"ContainerStarted","Data":"90b3fd38543ee1d2e6391bfe7cef5f93deb69d3dd90d31019934609b0a1597b6"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.508834 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.510809 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.510843 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.512420 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" 
event={"ID":"91f9bda2-9253-415f-83f1-794c4ba878ce","Type":"ContainerStarted","Data":"b22f962d02315bbf5df7be699562a37b9e9cd8028773f460c93fc702f62663a6"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.528963 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" podStartSLOduration=128.528936216 podStartE2EDuration="2m8.528936216s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.523586625 +0000 UTC m=+150.083494206" watchObservedRunningTime="2025-10-02 10:57:53.528936216 +0000 UTC m=+150.088843797" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.547232 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.548490 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.048474991 +0000 UTC m=+150.608382572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.557183 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6xvln" podStartSLOduration=128.557159191 podStartE2EDuration="2m8.557159191s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.551979916 +0000 UTC m=+150.111887497" watchObservedRunningTime="2025-10-02 10:57:53.557159191 +0000 UTC m=+150.117066772" Oct 02 10:57:53 crc kubenswrapper[4835]: W1002 10:57:53.565811 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c90a21f2cc3d6c56dd414011a489f1eb10b31699cb166e844e11e0b5ed80dcf0 WatchSource:0}: Error finding container c90a21f2cc3d6c56dd414011a489f1eb10b31699cb166e844e11e0b5ed80dcf0: Status 404 returned error can't find the container with id c90a21f2cc3d6c56dd414011a489f1eb10b31699cb166e844e11e0b5ed80dcf0 Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.582363 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" podStartSLOduration=128.582336145 podStartE2EDuration="2m8.582336145s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.580770128 +0000 UTC m=+150.140677709" watchObservedRunningTime="2025-10-02 10:57:53.582336145 +0000 UTC m=+150.142243726" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.599882 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" event={"ID":"90718d42-2c78-46d1-93a1-dfe0adae10f0","Type":"ContainerStarted","Data":"05cead85165582853b9c49906c6d38c5e334bca0c5e5c3d2b547797b2ffbb983"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.601528 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.609244 4835 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7292x container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.609344 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" podUID="90718d42-2c78-46d1-93a1-dfe0adae10f0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.620310 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" event={"ID":"3da125d5-132a-44df-ba26-fd6305dabcdc","Type":"ContainerStarted","Data":"115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.621712 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.624080 4835 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-shflp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.624124 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" podUID="3da125d5-132a-44df-ba26-fd6305dabcdc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.648981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.650010 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.149993051 +0000 UTC m=+150.709900632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.650456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" event={"ID":"91a50967-088f-4901-859e-1d1cb63549e6","Type":"ContainerStarted","Data":"be61f4ff878c0b8eeb81bde5bc57fff695ea90534d692f1a508efcc093a4b1a6"} Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.658161 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-gckdg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.658277 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" podUID="b41c8fbf-c78d-4b6c-8241-a4bbd2654291" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.663156 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" podStartSLOduration=128.663127675 podStartE2EDuration="2m8.663127675s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.646100005 +0000 UTC m=+150.206007596" watchObservedRunningTime="2025-10-02 10:57:53.663127675 +0000 UTC m=+150.223035256" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.665376 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-lz6t7" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.697525 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" podStartSLOduration=128.697497245 podStartE2EDuration="2m8.697497245s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:53.68364239 +0000 UTC m=+150.243549981" watchObservedRunningTime="2025-10-02 10:57:53.697497245 +0000 UTC m=+150.257404826" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.751367 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:53 crc 
kubenswrapper[4835]: E1002 10:57:53.753210 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.253183753 +0000 UTC m=+150.813091334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.836600 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.837212 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.859906 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.860313 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.360295032 +0000 UTC m=+150.920202623 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:53 crc kubenswrapper[4835]: I1002 10:57:53.961392 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:53 crc kubenswrapper[4835]: E1002 10:57:53.961722 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.46170583 +0000 UTC m=+151.021613411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.063249 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.063768 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.563746907 +0000 UTC m=+151.123654488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.132745 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:57:54 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:57:54 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:57:54 crc kubenswrapper[4835]: healthz check failed Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.132830 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.164309 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.165275 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.665258788 +0000 UTC m=+151.225166369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.165351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.165595 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.665588218 +0000 UTC m=+151.225495799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.269044 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.269419 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.769393348 +0000 UTC m=+151.329300929 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.371337 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.372009 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.871997392 +0000 UTC m=+151.431904973 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.477016 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.477575 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:54.977538354 +0000 UTC m=+151.537445935 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.579452 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.579852 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.079836689 +0000 UTC m=+151.639744270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.682159 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.682353 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.182323549 +0000 UTC m=+151.742231140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.683060 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.683470 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.183457823 +0000 UTC m=+151.743365404 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.684406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2a79cb377e49e1adfa26bd34437ece188154b32172948cabb715de092452274e"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.684521 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4391cd5436edc25a82b4e5094064e17b7868834803e383815a9cff934504cc02"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.702560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" event={"ID":"90718d42-2c78-46d1-93a1-dfe0adae10f0","Type":"ContainerStarted","Data":"00a63e57797c36a9353111d50dbb3ddc57b66810af9b49f0636b6fbceb946a2a"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.703898 4835 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7292x container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.703944 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" podUID="90718d42-2c78-46d1-93a1-dfe0adae10f0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 
10:57:54.731490 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p8g6s" event={"ID":"3bab576b-7852-4a66-a119-57f3f3d7c6e5","Type":"ContainerStarted","Data":"9413f9808a0672d92d7e3c471e8f8cb75eceee1ba48d1285a970f4a06f12212b"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.732300 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-p8g6s" Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.759774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" event={"ID":"5f3fb635-33e3-4a97-bcac-42f01899aeb9","Type":"ContainerStarted","Data":"4050e6dc55c1a94f088322c14501cbb4f628ba1114c371c166e218816f66f7b1"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.761576 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.765387 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nzqtr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.765468 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" podUID="5f3fb635-33e3-4a97-bcac-42f01899aeb9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.781361 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5043600e0e2f85e1120aee2999d15a1b6acd21af2fd4ac4a32d7bb1c75e36033"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.781421 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ea2844d03a1a12b0822ecdbed5be1f9ee845cc170f15dfbc310a6489e3788deb"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.783939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.784478 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.284450119 +0000 UTC m=+151.844357700 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.784669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.785213 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.285200881 +0000 UTC m=+151.845108462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.803016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2vmv5" event={"ID":"b4e59873-8cf1-4151-8ada-8b6ff4df2722","Type":"ContainerStarted","Data":"2a55f1c4065c1325337e3d0763983cce72fc0c1446acdbd3591623b0d9b1505d"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.803359 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2vmv5" event={"ID":"b4e59873-8cf1-4151-8ada-8b6ff4df2722","Type":"ContainerStarted","Data":"82e11c29194b8f6b6f8a8b1340c2467ef0c068e43ad2c5e2f212cd2d0c874661"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.841911 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" event={"ID":"d41cb54d-2237-42c8-81a4-9cefabbf0cd9","Type":"ContainerStarted","Data":"1315085a49cb0336f1710f1c7a49969003d5efc29d5d1bf711999033de0b91c3"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.841988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" event={"ID":"d41cb54d-2237-42c8-81a4-9cefabbf0cd9","Type":"ContainerStarted","Data":"7cab9687f586843aa03674fc0cd5b4b2816b9488cbe06961ced0971bf4225e65"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.874044 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" event={"ID":"c1ab4193-73f3-4a23-a134-ca28f61c7eb0","Type":"ContainerStarted","Data":"13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.875352 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" 
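The repeating MountVolume.MountDevice and UnmountVolume.TearDown failures in the records above and below all trace back to one condition: the kubelet cannot find kubevirt.io.hostpath-provisioner in its list of registered CSI plugins, so every attach and tear-down attempt for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 is rejected and re-queued with a 500ms backoff (the "No retries permitted until ... durationBeforeRetry 500ms" entries) until the driver pod registers itself with the node; a csi-hostpathplugin container start does appear a few records further down in this log. As a rough diagnostic sketch only, assuming the official kubernetes Python client, a kubeconfig for this cluster, and the node name "crc" taken from the log prefix, the registration state the kubelet is checking is mirrored in the cluster's CSIDriver and CSINode objects:

# Sketch: check CSI driver registration for the node seen in this log.
# Assumptions (not from the log itself): the `kubernetes` Python client is
# installed and a kubeconfig for this cluster is reachable. Driver, PVC and
# node names are copied from the log records above.
from kubernetes import client, config

DRIVER = "kubevirt.io.hostpath-provisioner"
NODE = "crc"

config.load_kube_config()            # or config.load_incluster_config()
storage = client.StorageV1Api()

# CSIDriver objects list drivers installed cluster-wide.
installed = [d.metadata.name for d in storage.list_csi_driver().items]

# CSINode mirrors the kubelet's per-node plugin registration list -- the
# same list the "not found in the list of registered CSI drivers" error
# refers to.
csinode = storage.read_csi_node(name=NODE)
registered = [d.name for d in (csinode.spec.drivers or [])]

print(f"{DRIVER} installed in cluster: {DRIVER in installed}")
print(f"{DRIVER} registered on node {NODE}: {DRIVER in registered}")

Until the second check reports true, the kubelet keeps emitting the 500ms-backoff mount/unmount errors seen throughout this section.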
Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.879512 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-grwx8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.879648 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.886078 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 10:57:54.887068 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.387047352 +0000 UTC m=+151.946954933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.897750 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" event={"ID":"afe0e854-1ff8-4bad-a5dc-3fc945d7a40f","Type":"ContainerStarted","Data":"a5608cd201bd79263b4cd6ba51acd27ef1b14808aacaf667bf360367547e7a8c"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.937967 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ssr48" event={"ID":"91f9bda2-9253-415f-83f1-794c4ba878ce","Type":"ContainerStarted","Data":"e46329fb85598691f0b6d358dfb910cb5239465d2a8c713628a323bcd94dc3bd"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.976101 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" event={"ID":"c5dbc29a-2839-4feb-b37b-e929809f1ca9","Type":"ContainerStarted","Data":"9fd35ce1fa068c1029abab8fe12493b61388f7a0681a6152e153444196162519"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.990713 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:54 crc kubenswrapper[4835]: E1002 
10:57:54.993182 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.493150801 +0000 UTC m=+152.053058382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.998100 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" event={"ID":"7facfd69-debd-4f8e-9a69-2ca06fcac7cc","Type":"ContainerStarted","Data":"417d9c67e1cea03cadf8d8d6b15e676e498ecb4865c75097b1fb0443fb9a7a92"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.998236 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" event={"ID":"7facfd69-debd-4f8e-9a69-2ca06fcac7cc","Type":"ContainerStarted","Data":"b8aa367c5b8fc7a2476996c36dcb74c9f5230344ab419b428d8d4f5fb4561caa"} Oct 02 10:57:54 crc kubenswrapper[4835]: I1002 10:57:54.998286 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" event={"ID":"7facfd69-debd-4f8e-9a69-2ca06fcac7cc","Type":"ContainerStarted","Data":"4e7ca13c4fbac8228124cc9cc80a3d7c5067e46dc407b77d09edf081eb4fe3b2"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.041588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" event={"ID":"5211ce1a-3935-4397-8ad3-34fe963cf626","Type":"ContainerStarted","Data":"a338300b866bc21da6017b2c1cdf42eea75f0a99413e69f5efae4ed327571705"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.041647 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" event={"ID":"5211ce1a-3935-4397-8ad3-34fe963cf626","Type":"ContainerStarted","Data":"233dd9916eddd354a42719ea169f7ff06ecd4863c8a9a1f59a2756121bb79141"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.041657 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" event={"ID":"5211ce1a-3935-4397-8ad3-34fe963cf626","Type":"ContainerStarted","Data":"9d523aa452b0797a8888e022b68c2670f7c7719f0315526a8ec9e3550deb7e48"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.044180 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" event={"ID":"91a50967-088f-4901-859e-1d1cb63549e6","Type":"ContainerStarted","Data":"0be43321f0cfd9e9aa5e17323f838fdf751ba0a0b57448275dd411677785294b"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.044268 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" event={"ID":"91a50967-088f-4901-859e-1d1cb63549e6","Type":"ContainerStarted","Data":"0e90844d114118b9025ea926e51ed5c94994eeb886e1398b8dccbb2afeb8d48a"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 
10:57:55.059859 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" event={"ID":"27dd8636-3710-41e0-ad06-680140702c28","Type":"ContainerStarted","Data":"17383bc55ee4e1354eae03896e757187f89f39df768c3312af2f9d9a8935b3cb"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.060314 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.062149 4835 patch_prober.go:28] interesting pod/apiserver-76f77b778f-99q5b container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.23:8443/livez\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.062212 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" podUID="27dd8636-3710-41e0-ad06-680140702c28" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.23:8443/livez\": dial tcp 10.217.0.23:8443: connect: connection refused" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.078902 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" event={"ID":"a6c9709a-6e83-4ccd-9647-82174b1840d3","Type":"ContainerStarted","Data":"08e8c3dada9d5186275b73be2359f0418c5d3af20e091684825e1489b626102a"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.078972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" event={"ID":"a6c9709a-6e83-4ccd-9647-82174b1840d3","Type":"ContainerStarted","Data":"7cdf5b08ffc380a28acebead8beb697cf59108e4b47c287129cf9e20a066bdd1"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.079000 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.083917 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-kg4tw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.083992 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" podUID="a6c9709a-6e83-4ccd-9647-82174b1840d3" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.091810 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.094754 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"83a3ca606a21ef03f7ff14e18af49946e9db35eb6ece5a1d5a5e09fd5756be09"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.094802 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c90a21f2cc3d6c56dd414011a489f1eb10b31699cb166e844e11e0b5ed80dcf0"} Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.094877 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.094939 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.096305 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.596281011 +0000 UTC m=+152.156188592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.099724 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.130266 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.139478 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:57:55 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:57:55 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:57:55 crc kubenswrapper[4835]: healthz check failed Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.139558 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gckdg" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.139608 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.184983 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.199292 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.203308 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ttsbw" podStartSLOduration=130.203283586 podStartE2EDuration="2m10.203283586s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.202270346 +0000 UTC m=+151.762177937" watchObservedRunningTime="2025-10-02 10:57:55.203283586 +0000 UTC m=+151.763191167" Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.212388 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.712361678 +0000 UTC m=+152.272269339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.285846 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p8g6s" podStartSLOduration=8.285807309 podStartE2EDuration="8.285807309s" podCreationTimestamp="2025-10-02 10:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.284028225 +0000 UTC m=+151.843935816" watchObservedRunningTime="2025-10-02 10:57:55.285807309 +0000 UTC m=+151.845714900" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.304807 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.305380 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.805356014 +0000 UTC m=+152.365263585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.337744 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" podStartSLOduration=130.337718914 podStartE2EDuration="2m10.337718914s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.330512128 +0000 UTC m=+151.890419709" watchObservedRunningTime="2025-10-02 10:57:55.337718914 +0000 UTC m=+151.897626505" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.377761 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pqfj4" podStartSLOduration=130.377726672 podStartE2EDuration="2m10.377726672s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.374762944 +0000 UTC m=+151.934670525" watchObservedRunningTime="2025-10-02 10:57:55.377726672 +0000 UTC m=+151.937634253" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.406603 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.407081 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:55.907066571 +0000 UTC m=+152.466974152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.415163 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fbjn7" podStartSLOduration=130.415142673 podStartE2EDuration="2m10.415142673s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.410450853 +0000 UTC m=+151.970358434" watchObservedRunningTime="2025-10-02 10:57:55.415142673 +0000 UTC m=+151.975050254" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.507444 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.507960 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.007934793 +0000 UTC m=+152.567842374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.517177 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tvq9m" podStartSLOduration=130.517152179 podStartE2EDuration="2m10.517152179s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.515386647 +0000 UTC m=+152.075294228" watchObservedRunningTime="2025-10-02 10:57:55.517152179 +0000 UTC m=+152.077059760" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.588446 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-l2rvj" podStartSLOduration=130.588425395 podStartE2EDuration="2m10.588425395s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.564033044 +0000 UTC m=+152.123940635" watchObservedRunningTime="2025-10-02 10:57:55.588425395 +0000 UTC m=+152.148332976" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.619241 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.619804 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.119789154 +0000 UTC m=+152.679696735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.721149 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.721772 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.221752399 +0000 UTC m=+152.781659980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.725667 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" podStartSLOduration=130.725655236 podStartE2EDuration="2m10.725655236s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.679789422 +0000 UTC m=+152.239697003" watchObservedRunningTime="2025-10-02 10:57:55.725655236 +0000 UTC m=+152.285562817" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.781597 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" podStartSLOduration=130.781568831 podStartE2EDuration="2m10.781568831s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.728243384 +0000 UTC m=+152.288150965" watchObservedRunningTime="2025-10-02 10:57:55.781568831 +0000 UTC m=+152.341476412" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.785461 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2vmv5" podStartSLOduration=8.785444367 podStartE2EDuration="8.785444367s" podCreationTimestamp="2025-10-02 10:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.781388876 +0000 UTC m=+152.341296477" watchObservedRunningTime="2025-10-02 10:57:55.785444367 +0000 UTC m=+152.345351948" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.822824 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.823510 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.323484397 +0000 UTC m=+152.883391978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.906654 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" podStartSLOduration=130.90661355700001 podStartE2EDuration="2m10.906613557s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.842823596 +0000 UTC m=+152.402731177" watchObservedRunningTime="2025-10-02 10:57:55.906613557 +0000 UTC m=+152.466521138" Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.925031 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.925278 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.425238545 +0000 UTC m=+152.985146136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.925440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:55 crc kubenswrapper[4835]: E1002 10:57:55.925957 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.425933326 +0000 UTC m=+152.985840897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:55 crc kubenswrapper[4835]: I1002 10:57:55.944257 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lbswp" podStartSLOduration=130.944236234 podStartE2EDuration="2m10.944236234s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.909865045 +0000 UTC m=+152.469772636" watchObservedRunningTime="2025-10-02 10:57:55.944236234 +0000 UTC m=+152.504143815" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.024644 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" podStartSLOduration=132.024626263 podStartE2EDuration="2m12.024626263s" podCreationTimestamp="2025-10-02 10:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:57:55.971340296 +0000 UTC m=+152.531247877" watchObservedRunningTime="2025-10-02 10:57:56.024626263 +0000 UTC m=+152.584533844" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.028116 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.028419 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.528371965 +0000 UTC m=+153.088279546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.029416 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.029801 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.529785547 +0000 UTC m=+153.089693128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.103771 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-459jg" event={"ID":"7c57d059-e457-4322-9580-32bdf8993a83","Type":"ContainerStarted","Data":"c2ecae2189faa3a29499984b8565cd623401721a78f64ee12080c965a0d1957d"} Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.107273 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" event={"ID":"27dd8636-3710-41e0-ad06-680140702c28","Type":"ContainerStarted","Data":"69d89162ef22e37449ad40fd878eb78483a13b0d2b383ae1e21f4db7861064b1"} Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.110945 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-grwx8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.111996 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.110968 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": 
dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.112514 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.118135 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zctj2" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.131134 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:57:56 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:57:56 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:57:56 crc kubenswrapper[4835]: healthz check failed Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.131237 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.132104 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.132316 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.632292198 +0000 UTC m=+153.192199789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.132822 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.133288 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.633278038 +0000 UTC m=+153.193185619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.134472 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kg4tw" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.141637 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7292x" Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.234942 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.237637 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.737614964 +0000 UTC m=+153.297522545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.337365 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.337937 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.837923559 +0000 UTC m=+153.397831140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.444483 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.445984 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:56.945967056 +0000 UTC m=+153.505874627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.549994 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.550473 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.050458416 +0000 UTC m=+153.610365997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.651725 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.651813 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.151761281 +0000 UTC m=+153.711668872 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.652462 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.652929 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.152918906 +0000 UTC m=+153.712826497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.753722 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.753991 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.253951793 +0000 UTC m=+153.813859374 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.754112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.754555 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.25454657 +0000 UTC m=+153.814454151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.855151 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.855376 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.35534151 +0000 UTC m=+153.915249091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.855638 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.856031 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.35601165 +0000 UTC m=+153.915919231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:56 crc kubenswrapper[4835]: I1002 10:57:56.956698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:56 crc kubenswrapper[4835]: E1002 10:57:56.957083 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.457060147 +0000 UTC m=+154.016967728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.058096 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.058870 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.558852647 +0000 UTC m=+154.118760228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.109280 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nzqtr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.109674 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" podUID="5f3fb635-33e3-4a97-bcac-42f01899aeb9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.120317 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-grwx8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.120386 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.132163 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:57:57 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:57:57 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:57:57 crc kubenswrapper[4835]: healthz check failed Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.132251 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.159581 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.160205 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.660162551 +0000 UTC m=+154.220070122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.261862 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.262414 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.762388564 +0000 UTC m=+154.322296145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.363415 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.363652 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.863611196 +0000 UTC m=+154.423518767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.364031 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.364450 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.864432021 +0000 UTC m=+154.424339602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.465243 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.465517 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.965467538 +0000 UTC m=+154.525375129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.465565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.465982 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:57.965965083 +0000 UTC m=+154.525872664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.567517 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.567683 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.067654499 +0000 UTC m=+154.627562080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.567941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.568408 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.068396201 +0000 UTC m=+154.628303782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.669450 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.669760 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.169705877 +0000 UTC m=+154.729613458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.771773 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.772338 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.272314331 +0000 UTC m=+154.832221902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.776524 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.777719 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.782882 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.803663 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.806868 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.873207 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.873400 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.373367328 +0000 UTC m=+154.933274909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.874183 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a0029c-0153-4d72-bc57-55249faf7be7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.874562 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.874837 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3a0029c-0153-4d72-bc57-55249faf7be7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.875067 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.375046458 +0000 UTC m=+154.934954209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.975853 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.976123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a0029c-0153-4d72-bc57-55249faf7be7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.976202 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3a0029c-0153-4d72-bc57-55249faf7be7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:57 crc kubenswrapper[4835]: I1002 10:57:57.976484 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a0029c-0153-4d72-bc57-55249faf7be7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:57 crc kubenswrapper[4835]: E1002 10:57:57.976618 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.476486377 +0000 UTC m=+155.036393958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.010423 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3a0029c-0153-4d72-bc57-55249faf7be7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.077577 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.077964 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.577949517 +0000 UTC m=+155.137857098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.109031 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.121146 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nzqtr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.121244 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" podUID="5f3fb635-33e3-4a97-bcac-42f01899aeb9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.129036 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:57:58 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:57:58 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:57:58 crc kubenswrapper[4835]: healthz check failed Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.129106 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.179265 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.179488 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.679450458 +0000 UTC m=+155.239358039 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.179782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.180292 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.680275433 +0000 UTC m=+155.240183014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.190930 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8qvnb"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.192588 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.197729 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.221862 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qvnb"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.248413 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.248690 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.252393 4835 patch_prober.go:28] interesting pod/console-f9d7485db-rzxmz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.252493 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rzxmz" podUID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.282726 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.283005 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-catalog-content\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.283067 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-utilities\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.283086 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbxv\" (UniqueName: \"kubernetes.io/projected/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-kube-api-access-lhbxv\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.283213 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.783194546 +0000 UTC m=+155.343102127 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.339547 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqdkv"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.348680 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.358500 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.362468 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqdkv"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.387001 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-catalog-content\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.387067 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-catalog-content\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.387109 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-utilities\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.387155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.387188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-utilities\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.387245 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbxv\" (UniqueName: \"kubernetes.io/projected/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-kube-api-access-lhbxv\") pod \"certified-operators-8qvnb\" (UID: 
\"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.387715 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2kj\" (UniqueName: \"kubernetes.io/projected/2557fb3c-0a66-47fb-95b2-64e41d22a740-kube-api-access-pt2kj\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.389414 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-catalog-content\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.389612 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.889596264 +0000 UTC m=+155.449503845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.390053 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-utilities\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.459095 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhbxv\" (UniqueName: \"kubernetes.io/projected/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-kube-api-access-lhbxv\") pod \"certified-operators-8qvnb\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.488240 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.488497 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-utilities\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.488571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2kj\" (UniqueName: 
\"kubernetes.io/projected/2557fb3c-0a66-47fb-95b2-64e41d22a740-kube-api-access-pt2kj\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.488622 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-catalog-content\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.489185 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-catalog-content\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.489510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-utilities\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.490733 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:58.990696023 +0000 UTC m=+155.550603604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.528711 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2kj\" (UniqueName: \"kubernetes.io/projected/2557fb3c-0a66-47fb-95b2-64e41d22a740-kube-api-access-pt2kj\") pod \"community-operators-nqdkv\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.534643 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.544693 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p72fs"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.571150 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p72fs"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.571368 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.599571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.599999 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.099983677 +0000 UTC m=+155.659891258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.650869 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.679486 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.700550 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.700822 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-utilities\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.700858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-catalog-content\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.700876 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52bh8\" (UniqueName: \"kubernetes.io/projected/c296c7cb-3e59-473c-af61-8c1bdc9366dc-kube-api-access-52bh8\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.700928 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.20088506 +0000 UTC m=+155.760792641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.731904 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9gw42"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.734967 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.753257 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gw42"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.803100 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4hl\" (UniqueName: \"kubernetes.io/projected/0927eb79-581a-448d-abb3-8c785e24274a-kube-api-access-zk4hl\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.803162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.803356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-catalog-content\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.803521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-utilities\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.803562 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.303546505 +0000 UTC m=+155.863454096 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.803598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-catalog-content\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.803624 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52bh8\" (UniqueName: \"kubernetes.io/projected/c296c7cb-3e59-473c-af61-8c1bdc9366dc-kube-api-access-52bh8\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.803755 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-utilities\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.804705 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-utilities\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.804764 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-catalog-content\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.839299 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52bh8\" (UniqueName: \"kubernetes.io/projected/c296c7cb-3e59-473c-af61-8c1bdc9366dc-kube-api-access-52bh8\") pod \"certified-operators-p72fs\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.857892 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.898087 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qvnb"] Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.904366 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.904754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-utilities\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.904815 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4hl\" (UniqueName: \"kubernetes.io/projected/0927eb79-581a-448d-abb3-8c785e24274a-kube-api-access-zk4hl\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.904848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-catalog-content\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.906869 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-utilities\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: E1002 10:57:58.907305 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.407214711 +0000 UTC m=+155.967122292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.907457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-catalog-content\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.938641 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4hl\" (UniqueName: \"kubernetes.io/projected/0927eb79-581a-448d-abb3-8c785e24274a-kube-api-access-zk4hl\") pod \"community-operators-9gw42\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:58 crc kubenswrapper[4835]: I1002 10:57:58.998533 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.008050 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.008538 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.508520986 +0000 UTC m=+156.068428567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.046457 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqdkv"] Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.062120 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.108991 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.109296 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.609273964 +0000 UTC m=+156.169181535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.130167 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:57:59 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:57:59 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:57:59 crc kubenswrapper[4835]: healthz check failed Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.130255 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.145339 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3a0029c-0153-4d72-bc57-55249faf7be7","Type":"ContainerStarted","Data":"1a52db8e7db53625f491f9cd961211259a0becfc74c3fe6f5b1d715e2e584668"} Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.151982 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qvnb" event={"ID":"038c63fa-b6fc-4725-9f58-e9c3bfe9595d","Type":"ContainerStarted","Data":"8e5ae7078fe9992bbb43e4852b5c8aedf9377bfa1d598772e1959e645bbae320"} Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.152920 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdkv" event={"ID":"2557fb3c-0a66-47fb-95b2-64e41d22a740","Type":"ContainerStarted","Data":"88eced280bcf4ebd3e7a592b39ceae25c2566664f94b89111f5cfbe351469a5d"} Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.155192 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-459jg" event={"ID":"7c57d059-e457-4322-9580-32bdf8993a83","Type":"ContainerStarted","Data":"60d77d1bc0ccc39e2165dfa4ddcf7ec97bdbff35ecb0bae52b7a778404c74c20"} Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.210421 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.210910 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.710890979 +0000 UTC m=+156.270798570 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.312109 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.312719 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.812663237 +0000 UTC m=+156.372570828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.313206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.314848 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.814832812 +0000 UTC m=+156.374740583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.416185 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.416399 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.916368834 +0000 UTC m=+156.476276415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.417758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.418721 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:57:59.918698814 +0000 UTC m=+156.478606395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.421210 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gw42"] Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.504274 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p72fs"] Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.519349 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.519521 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.019492674 +0000 UTC m=+156.579400255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.519771 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.520104 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.020090532 +0000 UTC m=+156.579998113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.611726 4835 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.621080 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.621330 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.121295933 +0000 UTC m=+156.681203514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.621440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.621851 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.12183999 +0000 UTC m=+156.681747571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.723311 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.723587 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.223568537 +0000 UTC m=+156.783476118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.723670 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.724574 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.224523296 +0000 UTC m=+156.784430877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.824822 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.825055 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.325020007 +0000 UTC m=+156.884927588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.825286 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.825696 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.325683717 +0000 UTC m=+156.885591298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.926678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.926890 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.426853138 +0000 UTC m=+156.986760719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:57:59 crc kubenswrapper[4835]: I1002 10:57:59.927057 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:57:59 crc kubenswrapper[4835]: E1002 10:57:59.927433 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.427419725 +0000 UTC m=+156.987327306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.028795 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.028991 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.528943656 +0000 UTC m=+157.088851257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.029200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.029758 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.52972401 +0000 UTC m=+157.089631591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.060923 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.067912 4835 patch_prober.go:28] interesting pod/apiserver-76f77b778f-99q5b container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]log ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]etcd ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/generic-apiserver-start-informers ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/max-in-flight-filter ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 02 10:58:00 crc kubenswrapper[4835]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 02 10:58:00 crc kubenswrapper[4835]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/project.openshift.io-projectcache ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/openshift.io-startinformers ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 02 10:58:00 crc kubenswrapper[4835]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 02 10:58:00 crc kubenswrapper[4835]: livez check failed Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.068037 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" podUID="27dd8636-3710-41e0-ad06-680140702c28" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.124466 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.130178 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.130410 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 10:58:00.630372715 +0000 UTC m=+157.190280306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.132264 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.132653 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.632635193 +0000 UTC m=+157.192542774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.132832 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:00 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:00 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:00 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.133161 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.163487 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3a0029c-0153-4d72-bc57-55249faf7be7","Type":"ContainerStarted","Data":"d1ec843dc9e9eff2972f19ca57833a1b77750bb5b4fd9225bb8155e17f098646"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.167081 4835 generic.go:334] "Generic (PLEG): container finished" podID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerID="5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96" exitCode=0 Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.167147 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qvnb" 
event={"ID":"038c63fa-b6fc-4725-9f58-e9c3bfe9595d","Type":"ContainerDied","Data":"5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.170133 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.170585 4835 generic.go:334] "Generic (PLEG): container finished" podID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerID="51747b90f271788965e22bf2f4ed6fb17f16eec5188caf44ae8c885d7acb7ca3" exitCode=0 Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.170987 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p72fs" event={"ID":"c296c7cb-3e59-473c-af61-8c1bdc9366dc","Type":"ContainerDied","Data":"51747b90f271788965e22bf2f4ed6fb17f16eec5188caf44ae8c885d7acb7ca3"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.172244 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p72fs" event={"ID":"c296c7cb-3e59-473c-af61-8c1bdc9366dc","Type":"ContainerStarted","Data":"27d496e9ee0f5481c60ec8658808663254ca9abb462034ee2c2084f05b0139df"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.176999 4835 generic.go:334] "Generic (PLEG): container finished" podID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerID="c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c" exitCode=0 Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.177099 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdkv" event={"ID":"2557fb3c-0a66-47fb-95b2-64e41d22a740","Type":"ContainerDied","Data":"c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.186277 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-459jg" event={"ID":"7c57d059-e457-4322-9580-32bdf8993a83","Type":"ContainerStarted","Data":"fa9038c819d9a40a01f73325762b67a33895a325e381b6e28668a8977a214f5f"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.186355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-459jg" event={"ID":"7c57d059-e457-4322-9580-32bdf8993a83","Type":"ContainerStarted","Data":"461b441d1d63f1a2bf60f3e7146215fd7d50dbe30b69ba4022ad8a1ddc3ec69b"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.187080 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.187066763 podStartE2EDuration="3.187066763s" podCreationTimestamp="2025-10-02 10:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:58:00.186579419 +0000 UTC m=+156.746487000" watchObservedRunningTime="2025-10-02 10:58:00.187066763 +0000 UTC m=+156.746974344" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.192743 4835 generic.go:334] "Generic (PLEG): container finished" podID="0927eb79-581a-448d-abb3-8c785e24274a" containerID="12901a1f9c85ea6eab2198dabc18084e62ca6e36602e346612080cb79dc0c196" exitCode=0 Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.192786 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gw42" 
event={"ID":"0927eb79-581a-448d-abb3-8c785e24274a","Type":"ContainerDied","Data":"12901a1f9c85ea6eab2198dabc18084e62ca6e36602e346612080cb79dc0c196"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.192807 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gw42" event={"ID":"0927eb79-581a-448d-abb3-8c785e24274a","Type":"ContainerStarted","Data":"feb7641ddd845c62154f1d20b19418aafa063fd10ffa2f974b621b52e7834451"} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.216895 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.217513 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.216894 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.217811 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.233952 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.234195 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.734148154 +0000 UTC m=+157.294055895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.235052 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.244006 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.743979678 +0000 UTC m=+157.303887259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.332408 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6clfp"] Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.334115 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.336490 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.337437 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.338034 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.838012615 +0000 UTC m=+157.397920196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.345262 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6clfp"] Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.437215 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.439246 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-catalog-content\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.439344 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkll7\" (UniqueName: \"kubernetes.io/projected/33d53270-0571-4a70-8b20-3b3f181e1a5c-kube-api-access-tkll7\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.439415 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.439461 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-utilities\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: E1002 10:58:00.439846 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:58:00.939828816 +0000 UTC m=+157.499736397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f29zd" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.457987 4835 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T10:57:59.611757368Z","Handler":null,"Name":""} Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.467079 4835 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.467195 4835 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.524648 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nzqtr" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.540982 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.541352 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-utilities\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.541396 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-catalog-content\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.541492 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkll7\" (UniqueName: \"kubernetes.io/projected/33d53270-0571-4a70-8b20-3b3f181e1a5c-kube-api-access-tkll7\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.541949 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-utilities\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.542037 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-catalog-content\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.552832 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.563662 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkll7\" (UniqueName: \"kubernetes.io/projected/33d53270-0571-4a70-8b20-3b3f181e1a5c-kube-api-access-tkll7\") pod \"redhat-marketplace-6clfp\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.643205 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.647172 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.647239 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.661057 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.724337 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f29zd\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.740271 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hsq2k"] Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.741407 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.757545 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hsq2k"] Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.766866 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.846111 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-utilities\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.846656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-catalog-content\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.846765 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjn7\" (UniqueName: \"kubernetes.io/projected/db1c8609-9e4d-49d2-9ffe-15075be7ada6-kube-api-access-sqjn7\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.885408 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.948052 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-utilities\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.948170 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-catalog-content\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.948270 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjn7\" (UniqueName: \"kubernetes.io/projected/db1c8609-9e4d-49d2-9ffe-15075be7ada6-kube-api-access-sqjn7\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.950425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-utilities\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.950766 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-catalog-content\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:00 crc kubenswrapper[4835]: I1002 10:58:00.983454 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjn7\" (UniqueName: \"kubernetes.io/projected/db1c8609-9e4d-49d2-9ffe-15075be7ada6-kube-api-access-sqjn7\") pod \"redhat-marketplace-hsq2k\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.066289 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.080393 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6clfp"] Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.130561 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:01 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:01 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:01 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.130644 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:01 crc kubenswrapper[4835]: W1002 10:58:01.144673 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d53270_0571_4a70_8b20_3b3f181e1a5c.slice/crio-297ed8e37f834102d033a05ccd9b175660f5f1a46a64d681a2b01bf3f612b1d3 WatchSource:0}: Error finding container 297ed8e37f834102d033a05ccd9b175660f5f1a46a64d681a2b01bf3f612b1d3: Status 404 returned error can't find the container with id 297ed8e37f834102d033a05ccd9b175660f5f1a46a64d681a2b01bf3f612b1d3 Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.240555 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6clfp" event={"ID":"33d53270-0571-4a70-8b20-3b3f181e1a5c","Type":"ContainerStarted","Data":"297ed8e37f834102d033a05ccd9b175660f5f1a46a64d681a2b01bf3f612b1d3"} Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.254673 4835 generic.go:334] "Generic (PLEG): container finished" podID="f3a0029c-0153-4d72-bc57-55249faf7be7" containerID="d1ec843dc9e9eff2972f19ca57833a1b77750bb5b4fd9225bb8155e17f098646" exitCode=0 Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.255685 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3a0029c-0153-4d72-bc57-55249faf7be7","Type":"ContainerDied","Data":"d1ec843dc9e9eff2972f19ca57833a1b77750bb5b4fd9225bb8155e17f098646"} Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.347534 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-459jg" podStartSLOduration=14.347503918 podStartE2EDuration="14.347503918s" podCreationTimestamp="2025-10-02 10:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:58:01.325722265 +0000 UTC m=+157.885629856" watchObservedRunningTime="2025-10-02 10:58:01.347503918 +0000 UTC m=+157.907411499" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.371984 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f29zd"] Oct 02 10:58:01 crc kubenswrapper[4835]: W1002 10:58:01.387723 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9aa78d4_be0b_4eb3_9cde_117c72496d16.slice/crio-e41f7659d341b1c5737dde89e41b8636204c5e7bcc6fb1240ec70fddd68bdca9 WatchSource:0}: Error finding container e41f7659d341b1c5737dde89e41b8636204c5e7bcc6fb1240ec70fddd68bdca9: Status 404 returned error can't find the container with id e41f7659d341b1c5737dde89e41b8636204c5e7bcc6fb1240ec70fddd68bdca9 Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.409301 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fdd7"] Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.411642 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.456951 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdd7"] Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.458727 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.460035 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8gd\" (UniqueName: \"kubernetes.io/projected/9ce2263d-d93b-48cb-b9a5-ec10967c9730-kube-api-access-7f8gd\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.460115 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-utilities\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.460156 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-catalog-content\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.565540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8gd\" (UniqueName: \"kubernetes.io/projected/9ce2263d-d93b-48cb-b9a5-ec10967c9730-kube-api-access-7f8gd\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.565618 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-utilities\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.567661 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-catalog-content\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc 
kubenswrapper[4835]: I1002 10:58:01.568273 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-catalog-content\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.569037 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-utilities\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.619735 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8gd\" (UniqueName: \"kubernetes.io/projected/9ce2263d-d93b-48cb-b9a5-ec10967c9730-kube-api-access-7f8gd\") pod \"redhat-operators-5fdd7\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.686742 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hsq2k"] Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.729703 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbwpk"] Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.731085 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.757205 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbwpk"] Oct 02 10:58:01 crc kubenswrapper[4835]: W1002 10:58:01.762020 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb1c8609_9e4d_49d2_9ffe_15075be7ada6.slice/crio-bd16cbd381c0d8150b9ba4fbb56b91156681485bd04523a4ce96f2340c969d40 WatchSource:0}: Error finding container bd16cbd381c0d8150b9ba4fbb56b91156681485bd04523a4ce96f2340c969d40: Status 404 returned error can't find the container with id bd16cbd381c0d8150b9ba4fbb56b91156681485bd04523a4ce96f2340c969d40 Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.818498 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.870754 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-utilities\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.870813 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jgn6\" (UniqueName: \"kubernetes.io/projected/6cbd9159-5af9-4954-a9d6-29e3abb32763-kube-api-access-7jgn6\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.870908 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-catalog-content\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.973455 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-catalog-content\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.973516 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-utilities\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.973539 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jgn6\" (UniqueName: \"kubernetes.io/projected/6cbd9159-5af9-4954-a9d6-29e3abb32763-kube-api-access-7jgn6\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.974042 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-catalog-content\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.974299 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-utilities\") pod \"redhat-operators-fbwpk\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:01 crc kubenswrapper[4835]: I1002 10:58:01.998646 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jgn6\" (UniqueName: \"kubernetes.io/projected/6cbd9159-5af9-4954-a9d6-29e3abb32763-kube-api-access-7jgn6\") pod \"redhat-operators-fbwpk\" (UID: 
\"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.064405 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.128748 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:02 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:02 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:02 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.141271 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.265428 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.295307 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdd7"] Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.333033 4835 generic.go:334] "Generic (PLEG): container finished" podID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerID="2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a" exitCode=0 Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.333445 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hsq2k" event={"ID":"db1c8609-9e4d-49d2-9ffe-15075be7ada6","Type":"ContainerDied","Data":"2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a"} Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.333511 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hsq2k" event={"ID":"db1c8609-9e4d-49d2-9ffe-15075be7ada6","Type":"ContainerStarted","Data":"bd16cbd381c0d8150b9ba4fbb56b91156681485bd04523a4ce96f2340c969d40"} Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.339303 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p8g6s" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.339357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" event={"ID":"b9aa78d4-be0b-4eb3-9cde-117c72496d16","Type":"ContainerStarted","Data":"0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be"} Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.339386 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" event={"ID":"b9aa78d4-be0b-4eb3-9cde-117c72496d16","Type":"ContainerStarted","Data":"e41f7659d341b1c5737dde89e41b8636204c5e7bcc6fb1240ec70fddd68bdca9"} Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.339921 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.359135 4835 generic.go:334] 
"Generic (PLEG): container finished" podID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerID="994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854" exitCode=0 Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.360031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6clfp" event={"ID":"33d53270-0571-4a70-8b20-3b3f181e1a5c","Type":"ContainerDied","Data":"994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854"} Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.413442 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbwpk"] Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.419502 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" podStartSLOduration=137.419456033 podStartE2EDuration="2m17.419456033s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:58:02.406669159 +0000 UTC m=+158.966576740" watchObservedRunningTime="2025-10-02 10:58:02.419456033 +0000 UTC m=+158.979363614" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.658534 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.787485 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a0029c-0153-4d72-bc57-55249faf7be7-kubelet-dir\") pod \"f3a0029c-0153-4d72-bc57-55249faf7be7\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.787620 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3a0029c-0153-4d72-bc57-55249faf7be7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f3a0029c-0153-4d72-bc57-55249faf7be7" (UID: "f3a0029c-0153-4d72-bc57-55249faf7be7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.787662 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3a0029c-0153-4d72-bc57-55249faf7be7-kube-api-access\") pod \"f3a0029c-0153-4d72-bc57-55249faf7be7\" (UID: \"f3a0029c-0153-4d72-bc57-55249faf7be7\") " Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.787999 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a0029c-0153-4d72-bc57-55249faf7be7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.799484 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a0029c-0153-4d72-bc57-55249faf7be7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f3a0029c-0153-4d72-bc57-55249faf7be7" (UID: "f3a0029c-0153-4d72-bc57-55249faf7be7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.884269 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 10:58:02 crc kubenswrapper[4835]: E1002 10:58:02.884673 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a0029c-0153-4d72-bc57-55249faf7be7" containerName="pruner" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.884689 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a0029c-0153-4d72-bc57-55249faf7be7" containerName="pruner" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.884852 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a0029c-0153-4d72-bc57-55249faf7be7" containerName="pruner" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.885299 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.887866 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.887919 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.888752 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3a0029c-0153-4d72-bc57-55249faf7be7-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.899145 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.990025 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:02 crc kubenswrapper[4835]: I1002 10:58:02.990506 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.091981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.092157 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.092334 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.114311 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.128518 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:03 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:03 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:03 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.128583 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.211437 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.372574 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3a0029c-0153-4d72-bc57-55249faf7be7","Type":"ContainerDied","Data":"1a52db8e7db53625f491f9cd961211259a0becfc74c3fe6f5b1d715e2e584668"} Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.372622 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a52db8e7db53625f491f9cd961211259a0becfc74c3fe6f5b1d715e2e584668" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.372617 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.380900 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerID="9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f" exitCode=0 Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.380980 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdd7" event={"ID":"9ce2263d-d93b-48cb-b9a5-ec10967c9730","Type":"ContainerDied","Data":"9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f"} Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.381030 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdd7" event={"ID":"9ce2263d-d93b-48cb-b9a5-ec10967c9730","Type":"ContainerStarted","Data":"3796abc5c3b875b41c13a0f5dea8e8b6898ac1dceddb8a60ec0976a6d8a2b439"} Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.392648 4835 generic.go:334] "Generic (PLEG): container finished" podID="380763b9-fdb6-4b62-a8e0-c775708be101" containerID="c4fbd1bc8691c74e725304953e117bf2a8b494624f56496be446332544b4ffe7" exitCode=0 Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.392772 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" event={"ID":"380763b9-fdb6-4b62-a8e0-c775708be101","Type":"ContainerDied","Data":"c4fbd1bc8691c74e725304953e117bf2a8b494624f56496be446332544b4ffe7"} Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.398193 4835 generic.go:334] "Generic (PLEG): container finished" podID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerID="e4054a622a1b04cfea34b409255b7e5cb26bfbd3bf019a4834acb04179b4a270" exitCode=0 Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.398334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbwpk" event={"ID":"6cbd9159-5af9-4954-a9d6-29e3abb32763","Type":"ContainerDied","Data":"e4054a622a1b04cfea34b409255b7e5cb26bfbd3bf019a4834acb04179b4a270"} Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.398369 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbwpk" event={"ID":"6cbd9159-5af9-4954-a9d6-29e3abb32763","Type":"ContainerStarted","Data":"9659b2476b786603e82cfd265ff653e2b9873223d1839219b641dc6efc8017f4"} Oct 02 10:58:03 crc kubenswrapper[4835]: I1002 10:58:03.551109 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.129858 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:04 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:04 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:04 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.130411 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.422928 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4","Type":"ContainerStarted","Data":"7cb4379c6e592c1ded26c8d018e626c3117bdfd45d313a6814a8d9671299d9c0"} Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.761414 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.925021 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/380763b9-fdb6-4b62-a8e0-c775708be101-config-volume\") pod \"380763b9-fdb6-4b62-a8e0-c775708be101\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.925163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/380763b9-fdb6-4b62-a8e0-c775708be101-secret-volume\") pod \"380763b9-fdb6-4b62-a8e0-c775708be101\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.925185 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44t29\" (UniqueName: \"kubernetes.io/projected/380763b9-fdb6-4b62-a8e0-c775708be101-kube-api-access-44t29\") pod \"380763b9-fdb6-4b62-a8e0-c775708be101\" (UID: \"380763b9-fdb6-4b62-a8e0-c775708be101\") " Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.926595 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/380763b9-fdb6-4b62-a8e0-c775708be101-config-volume" (OuterVolumeSpecName: "config-volume") pod "380763b9-fdb6-4b62-a8e0-c775708be101" (UID: "380763b9-fdb6-4b62-a8e0-c775708be101"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.934465 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/380763b9-fdb6-4b62-a8e0-c775708be101-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "380763b9-fdb6-4b62-a8e0-c775708be101" (UID: "380763b9-fdb6-4b62-a8e0-c775708be101"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:58:04 crc kubenswrapper[4835]: I1002 10:58:04.936950 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/380763b9-fdb6-4b62-a8e0-c775708be101-kube-api-access-44t29" (OuterVolumeSpecName: "kube-api-access-44t29") pod "380763b9-fdb6-4b62-a8e0-c775708be101" (UID: "380763b9-fdb6-4b62-a8e0-c775708be101"). InnerVolumeSpecName "kube-api-access-44t29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.027311 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/380763b9-fdb6-4b62-a8e0-c775708be101-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.027344 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/380763b9-fdb6-4b62-a8e0-c775708be101-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.027355 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44t29\" (UniqueName: \"kubernetes.io/projected/380763b9-fdb6-4b62-a8e0-c775708be101-kube-api-access-44t29\") on node \"crc\" DevicePath \"\"" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.067080 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.072190 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-99q5b" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.146570 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:05 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:05 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:05 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.146734 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.443930 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.445575 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk" event={"ID":"380763b9-fdb6-4b62-a8e0-c775708be101","Type":"ContainerDied","Data":"611019d2cc35e40e4fb270f9a5240aff221d1a7a6f93bdb5e8960efe4152ae6e"} Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.445625 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611019d2cc35e40e4fb270f9a5240aff221d1a7a6f93bdb5e8960efe4152ae6e" Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.453926 4835 generic.go:334] "Generic (PLEG): container finished" podID="e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4" containerID="f22fec6c7da3b0a438c9c47a5b440b0321e876c13f9330b0b97a5596d4c72138" exitCode=0 Oct 02 10:58:05 crc kubenswrapper[4835]: I1002 10:58:05.453975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4","Type":"ContainerDied","Data":"f22fec6c7da3b0a438c9c47a5b440b0321e876c13f9330b0b97a5596d4c72138"} Oct 02 10:58:06 crc kubenswrapper[4835]: I1002 10:58:06.128675 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:06 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:06 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:06 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:06 crc kubenswrapper[4835]: I1002 10:58:06.130230 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:07 crc kubenswrapper[4835]: I1002 10:58:07.129324 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:07 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:07 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:07 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:07 crc kubenswrapper[4835]: I1002 10:58:07.129413 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:07 crc kubenswrapper[4835]: I1002 10:58:07.383109 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod \"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:58:07 crc kubenswrapper[4835]: I1002 10:58:07.390279 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fddaac1-5041-411a-8aed-e7337c06713f-metrics-certs\") pod 
\"network-metrics-daemon-5j5j6\" (UID: \"7fddaac1-5041-411a-8aed-e7337c06713f\") " pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:58:07 crc kubenswrapper[4835]: I1002 10:58:07.466997 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j5j6" Oct 02 10:58:08 crc kubenswrapper[4835]: I1002 10:58:08.127505 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:08 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:08 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:08 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:08 crc kubenswrapper[4835]: I1002 10:58:08.127945 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:08 crc kubenswrapper[4835]: I1002 10:58:08.248119 4835 patch_prober.go:28] interesting pod/console-f9d7485db-rzxmz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 02 10:58:08 crc kubenswrapper[4835]: I1002 10:58:08.248177 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rzxmz" podUID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 02 10:58:09 crc kubenswrapper[4835]: I1002 10:58:09.139705 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:09 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:09 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:09 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:09 crc kubenswrapper[4835]: I1002 10:58:09.139765 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.126194 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:10 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:10 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:10 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.126297 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 
10:58:10.220116 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.220247 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.220789 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.220818 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.494164 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4","Type":"ContainerDied","Data":"7cb4379c6e592c1ded26c8d018e626c3117bdfd45d313a6814a8d9671299d9c0"} Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.494218 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cb4379c6e592c1ded26c8d018e626c3117bdfd45d313a6814a8d9671299d9c0" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.523772 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.641912 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kubelet-dir\") pod \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.642059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kube-api-access\") pod \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\" (UID: \"e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4\") " Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.642390 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4" (UID: "e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.648469 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4" (UID: "e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.743787 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 10:58:10 crc kubenswrapper[4835]: I1002 10:58:10.744185 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:58:11 crc kubenswrapper[4835]: I1002 10:58:11.128639 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:11 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:11 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:11 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:11 crc kubenswrapper[4835]: I1002 10:58:11.128747 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:11 crc kubenswrapper[4835]: I1002 10:58:11.500330 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:58:11 crc kubenswrapper[4835]: I1002 10:58:11.984312 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:58:11 crc kubenswrapper[4835]: I1002 10:58:11.984390 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:58:12 crc kubenswrapper[4835]: I1002 10:58:12.127033 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:12 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:12 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:12 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:12 crc kubenswrapper[4835]: I1002 10:58:12.127130 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:13 crc kubenswrapper[4835]: I1002 10:58:13.127440 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:13 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 02 10:58:13 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:13 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:13 crc kubenswrapper[4835]: I1002 10:58:13.127573 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:14 crc kubenswrapper[4835]: I1002 10:58:14.127156 4835 patch_prober.go:28] interesting pod/router-default-5444994796-vpsrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:58:14 crc kubenswrapper[4835]: [+]has-synced ok Oct 02 10:58:14 crc kubenswrapper[4835]: [+]process-running ok Oct 02 10:58:14 crc kubenswrapper[4835]: healthz check failed Oct 02 10:58:14 crc kubenswrapper[4835]: I1002 10:58:14.127267 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vpsrn" podUID="ea6c1cb7-07e8-4a81-9240-5ade44f372cf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:58:15 crc kubenswrapper[4835]: I1002 10:58:15.128696 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:58:15 crc kubenswrapper[4835]: I1002 10:58:15.131709 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vpsrn" Oct 02 10:58:18 crc kubenswrapper[4835]: I1002 10:58:18.262277 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:58:18 crc kubenswrapper[4835]: I1002 10:58:18.267112 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.218448 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.218786 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.218630 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.218933 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.218847 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.219582 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.219673 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.219792 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"90b3fd38543ee1d2e6391bfe7cef5f93deb69d3dd90d31019934609b0a1597b6"} pod="openshift-console/downloads-7954f5f757-6xvln" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.219904 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" containerID="cri-o://90b3fd38543ee1d2e6391bfe7cef5f93deb69d3dd90d31019934609b0a1597b6" gracePeriod=2 
Oct 02 10:58:20 crc kubenswrapper[4835]: I1002 10:58:20.891582 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 10:58:21 crc kubenswrapper[4835]: I1002 10:58:21.569014 4835 generic.go:334] "Generic (PLEG): container finished" podID="495e3390-c592-4669-b313-ca2d397746f7" containerID="90b3fd38543ee1d2e6391bfe7cef5f93deb69d3dd90d31019934609b0a1597b6" exitCode=0 Oct 02 10:58:21 crc kubenswrapper[4835]: I1002 10:58:21.569109 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6xvln" event={"ID":"495e3390-c592-4669-b313-ca2d397746f7","Type":"ContainerDied","Data":"90b3fd38543ee1d2e6391bfe7cef5f93deb69d3dd90d31019934609b0a1597b6"} Oct 02 10:58:25 crc kubenswrapper[4835]: E1002 10:58:25.734977 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 10:58:25 crc kubenswrapper[4835]: E1002 10:58:25.735881 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lhbxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8qvnb_openshift-marketplace(038c63fa-b6fc-4725-9f58-e9c3bfe9595d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:58:25 crc kubenswrapper[4835]: E1002 10:58:25.737442 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8qvnb" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" Oct 02 10:58:28 crc kubenswrapper[4835]: E1002 10:58:28.447438 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8qvnb" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" Oct 02 10:58:28 crc kubenswrapper[4835]: I1002 10:58:28.866523 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5j5j6"] Oct 02 10:58:28 crc kubenswrapper[4835]: W1002 10:58:28.914778 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fddaac1_5041_411a_8aed_e7337c06713f.slice/crio-ef12a5fed19cf09bc67975eca79c19b6568ef1843f11fe7b481938c93553f007 WatchSource:0}: Error finding container ef12a5fed19cf09bc67975eca79c19b6568ef1843f11fe7b481938c93553f007: Status 404 returned error can't find the container with id ef12a5fed19cf09bc67975eca79c19b6568ef1843f11fe7b481938c93553f007 Oct 02 10:58:28 crc kubenswrapper[4835]: E1002 10:58:28.948833 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 10:58:28 crc kubenswrapper[4835]: E1002 10:58:28.949115 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-52bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p72fs_openshift-marketplace(c296c7cb-3e59-473c-af61-8c1bdc9366dc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:58:28 crc kubenswrapper[4835]: E1002 10:58:28.950333 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p72fs" 
podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" Oct 02 10:58:29 crc kubenswrapper[4835]: I1002 10:58:29.620359 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" event={"ID":"7fddaac1-5041-411a-8aed-e7337c06713f","Type":"ContainerStarted","Data":"ef12a5fed19cf09bc67975eca79c19b6568ef1843f11fe7b481938c93553f007"} Oct 02 10:58:29 crc kubenswrapper[4835]: E1002 10:58:29.623084 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p72fs" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" Oct 02 10:58:30 crc kubenswrapper[4835]: I1002 10:58:30.218617 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:30 crc kubenswrapper[4835]: I1002 10:58:30.218701 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:30 crc kubenswrapper[4835]: I1002 10:58:30.250133 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xn57b" Oct 02 10:58:30 crc kubenswrapper[4835]: I1002 10:58:30.628076 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6xvln" event={"ID":"495e3390-c592-4669-b313-ca2d397746f7","Type":"ContainerStarted","Data":"32c1b754d79b9caeed0b8d8fb97c59725107e764c2f862f1df8aa8e3f6955b34"} Oct 02 10:58:31 crc kubenswrapper[4835]: E1002 10:58:31.268640 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 10:58:31 crc kubenswrapper[4835]: E1002 10:58:31.269380 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pt2kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nqdkv_openshift-marketplace(2557fb3c-0a66-47fb-95b2-64e41d22a740): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:58:31 crc kubenswrapper[4835]: E1002 10:58:31.272015 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nqdkv" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" Oct 02 10:58:31 crc kubenswrapper[4835]: I1002 10:58:31.637621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" event={"ID":"7fddaac1-5041-411a-8aed-e7337c06713f","Type":"ContainerStarted","Data":"497e7d3369dd1387099195df716f4e7d075e731d600339b30fbd519c5808e383"} Oct 02 10:58:31 crc kubenswrapper[4835]: I1002 10:58:31.638452 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:31 crc kubenswrapper[4835]: I1002 10:58:31.638491 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:31 crc kubenswrapper[4835]: I1002 10:58:31.638680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:58:31 crc kubenswrapper[4835]: E1002 10:58:31.638832 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nqdkv" 
podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" Oct 02 10:58:32 crc kubenswrapper[4835]: I1002 10:58:32.402134 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:58:32 crc kubenswrapper[4835]: I1002 10:58:32.644936 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5j5j6" event={"ID":"7fddaac1-5041-411a-8aed-e7337c06713f","Type":"ContainerStarted","Data":"01510e929b81bac277a0109c31b1239586ea5e7d6e989e7abbd70d957dd1cf38"} Oct 02 10:58:32 crc kubenswrapper[4835]: I1002 10:58:32.645936 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:32 crc kubenswrapper[4835]: I1002 10:58:32.646051 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:32 crc kubenswrapper[4835]: I1002 10:58:32.665442 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5j5j6" podStartSLOduration=167.665416828 podStartE2EDuration="2m47.665416828s" podCreationTimestamp="2025-10-02 10:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:58:32.662453949 +0000 UTC m=+189.222361560" watchObservedRunningTime="2025-10-02 10:58:32.665416828 +0000 UTC m=+189.225324409" Oct 02 10:58:33 crc kubenswrapper[4835]: E1002 10:58:33.255576 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 10:58:33 crc kubenswrapper[4835]: E1002 10:58:33.255812 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zk4hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9gw42_openshift-marketplace(0927eb79-581a-448d-abb3-8c785e24274a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:58:33 crc kubenswrapper[4835]: E1002 10:58:33.257060 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9gw42" podUID="0927eb79-581a-448d-abb3-8c785e24274a" Oct 02 10:58:37 crc kubenswrapper[4835]: E1002 10:58:37.311855 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 10:58:37 crc kubenswrapper[4835]: E1002 10:58:37.312481 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqjn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hsq2k_openshift-marketplace(db1c8609-9e4d-49d2-9ffe-15075be7ada6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1\": context canceled" logger="UnhandledError" Oct 02 10:58:37 crc kubenswrapper[4835]: E1002 10:58:37.313734 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-hsq2k" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" Oct 02 10:58:40 crc kubenswrapper[4835]: I1002 10:58:40.218650 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:40 crc kubenswrapper[4835]: I1002 10:58:40.219084 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:40 crc kubenswrapper[4835]: I1002 10:58:40.218804 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xvln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 02 10:58:40 crc kubenswrapper[4835]: I1002 10:58:40.219468 4835 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xvln" podUID="495e3390-c592-4669-b313-ca2d397746f7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 02 10:58:41 crc kubenswrapper[4835]: I1002 10:58:41.984290 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:58:41 crc kubenswrapper[4835]: I1002 10:58:41.984369 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:58:43 crc kubenswrapper[4835]: E1002 10:58:43.093079 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 10:58:43 crc kubenswrapper[4835]: E1002 10:58:43.093352 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkll7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6clfp_openshift-marketplace(33d53270-0571-4a70-8b20-3b3f181e1a5c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1: Get 
\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1\": context canceled" logger="UnhandledError" Oct 02 10:58:43 crc kubenswrapper[4835]: E1002 10:58:43.094578 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:4fcf0426ebc05442a4c6e577d2e4f80bebb28f88fd9b27d7c57520dcd918bed1\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-6clfp" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" Oct 02 10:58:50 crc kubenswrapper[4835]: I1002 10:58:50.226153 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6xvln" Oct 02 10:59:03 crc kubenswrapper[4835]: E1002 10:59:03.115153 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 10:59:03 crc kubenswrapper[4835]: E1002 10:59:03.116112 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7f8gd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5fdd7_openshift-marketplace(9ce2263d-d93b-48cb-b9a5-ec10967c9730): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:59:03 crc kubenswrapper[4835]: E1002 10:59:03.117303 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/redhat-operators-5fdd7" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" Oct 02 10:59:04 crc kubenswrapper[4835]: E1002 10:59:04.522364 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 10:59:04 crc kubenswrapper[4835]: E1002 10:59:04.522576 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jgn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fbwpk_openshift-marketplace(6cbd9159-5af9-4954-a9d6-29e3abb32763): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:59:04 crc kubenswrapper[4835]: E1002 10:59:04.524173 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fbwpk" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" Oct 02 10:59:11 crc kubenswrapper[4835]: I1002 10:59:11.984953 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:59:11 crc kubenswrapper[4835]: I1002 10:59:11.985383 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:59:11 crc kubenswrapper[4835]: I1002 10:59:11.985468 4835 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 10:59:11 crc kubenswrapper[4835]: I1002 10:59:11.986192 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 10:59:11 crc kubenswrapper[4835]: I1002 10:59:11.986314 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f" gracePeriod=600 Oct 02 10:59:18 crc kubenswrapper[4835]: I1002 10:59:18.944143 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f" exitCode=0 Oct 02 10:59:18 crc kubenswrapper[4835]: I1002 10:59:18.944279 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f"} Oct 02 10:59:49 crc kubenswrapper[4835]: E1002 10:59:49.780571 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 10:59:49 crc kubenswrapper[4835]: E1002 10:59:49.781410 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqjn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-hsq2k_openshift-marketplace(db1c8609-9e4d-49d2-9ffe-15075be7ada6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:59:49 crc kubenswrapper[4835]: E1002 10:59:49.784344 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hsq2k" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" Oct 02 10:59:49 crc kubenswrapper[4835]: E1002 10:59:49.793471 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 10:59:49 crc kubenswrapper[4835]: E1002 10:59:49.793681 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkll7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6clfp_openshift-marketplace(33d53270-0571-4a70-8b20-3b3f181e1a5c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:59:49 crc kubenswrapper[4835]: E1002 10:59:49.795104 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6clfp" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" Oct 02 10:59:50 crc kubenswrapper[4835]: I1002 10:59:50.178006 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gw42" 
event={"ID":"0927eb79-581a-448d-abb3-8c785e24274a","Type":"ContainerStarted","Data":"c16dd3fbdba4c872b938d009bce06917ddcfc7ba72793a9bf113dcfdeb6a6966"} Oct 02 10:59:50 crc kubenswrapper[4835]: I1002 10:59:50.183170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qvnb" event={"ID":"038c63fa-b6fc-4725-9f58-e9c3bfe9595d","Type":"ContainerStarted","Data":"246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0"} Oct 02 10:59:50 crc kubenswrapper[4835]: I1002 10:59:50.185343 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p72fs" event={"ID":"c296c7cb-3e59-473c-af61-8c1bdc9366dc","Type":"ContainerStarted","Data":"8538ae54a15dc6a9b1b6c24ad2eca35a6382a241f4e42d8f2ac27bf10cda6ddc"} Oct 02 10:59:50 crc kubenswrapper[4835]: I1002 10:59:50.188837 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdkv" event={"ID":"2557fb3c-0a66-47fb-95b2-64e41d22a740","Type":"ContainerStarted","Data":"51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6"} Oct 02 10:59:50 crc kubenswrapper[4835]: I1002 10:59:50.192240 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"9c5852139e1391baa79b13c185dfc951b255ae7419577505db83d6247b44ad1d"} Oct 02 10:59:50 crc kubenswrapper[4835]: E1002 10:59:50.194074 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6clfp" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" Oct 02 10:59:50 crc kubenswrapper[4835]: E1002 10:59:50.194792 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hsq2k" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.200242 4835 generic.go:334] "Generic (PLEG): container finished" podID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerID="51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6" exitCode=0 Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.200654 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdkv" event={"ID":"2557fb3c-0a66-47fb-95b2-64e41d22a740","Type":"ContainerDied","Data":"51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6"} Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.203754 4835 generic.go:334] "Generic (PLEG): container finished" podID="0927eb79-581a-448d-abb3-8c785e24274a" containerID="c16dd3fbdba4c872b938d009bce06917ddcfc7ba72793a9bf113dcfdeb6a6966" exitCode=0 Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.203818 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gw42" event={"ID":"0927eb79-581a-448d-abb3-8c785e24274a","Type":"ContainerDied","Data":"c16dd3fbdba4c872b938d009bce06917ddcfc7ba72793a9bf113dcfdeb6a6966"} Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.206946 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerID="246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0" exitCode=0 Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.207040 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qvnb" event={"ID":"038c63fa-b6fc-4725-9f58-e9c3bfe9595d","Type":"ContainerDied","Data":"246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0"} Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.209382 4835 generic.go:334] "Generic (PLEG): container finished" podID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerID="8538ae54a15dc6a9b1b6c24ad2eca35a6382a241f4e42d8f2ac27bf10cda6ddc" exitCode=0 Oct 02 10:59:51 crc kubenswrapper[4835]: I1002 10:59:51.209458 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p72fs" event={"ID":"c296c7cb-3e59-473c-af61-8c1bdc9366dc","Type":"ContainerDied","Data":"8538ae54a15dc6a9b1b6c24ad2eca35a6382a241f4e42d8f2ac27bf10cda6ddc"} Oct 02 10:59:52 crc kubenswrapper[4835]: I1002 10:59:52.219818 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdd7" event={"ID":"9ce2263d-d93b-48cb-b9a5-ec10967c9730","Type":"ContainerStarted","Data":"385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff"} Oct 02 10:59:52 crc kubenswrapper[4835]: I1002 10:59:52.223919 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbwpk" event={"ID":"6cbd9159-5af9-4954-a9d6-29e3abb32763","Type":"ContainerStarted","Data":"d62af045b2fc32decc288cdfbf975419574eb0f3580eff451bfa8db05f722acb"} Oct 02 10:59:53 crc kubenswrapper[4835]: I1002 10:59:53.233172 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerID="385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff" exitCode=0 Oct 02 10:59:53 crc kubenswrapper[4835]: I1002 10:59:53.233264 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdd7" event={"ID":"9ce2263d-d93b-48cb-b9a5-ec10967c9730","Type":"ContainerDied","Data":"385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff"} Oct 02 10:59:53 crc kubenswrapper[4835]: I1002 10:59:53.235786 4835 generic.go:334] "Generic (PLEG): container finished" podID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerID="d62af045b2fc32decc288cdfbf975419574eb0f3580eff451bfa8db05f722acb" exitCode=0 Oct 02 10:59:53 crc kubenswrapper[4835]: I1002 10:59:53.235880 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbwpk" event={"ID":"6cbd9159-5af9-4954-a9d6-29e3abb32763","Type":"ContainerDied","Data":"d62af045b2fc32decc288cdfbf975419574eb0f3580eff451bfa8db05f722acb"} Oct 02 10:59:53 crc kubenswrapper[4835]: I1002 10:59:53.237990 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p72fs" event={"ID":"c296c7cb-3e59-473c-af61-8c1bdc9366dc","Type":"ContainerStarted","Data":"5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4"} Oct 02 10:59:53 crc kubenswrapper[4835]: I1002 10:59:53.295918 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p72fs" podStartSLOduration=2.809629711 podStartE2EDuration="1m55.295892176s" podCreationTimestamp="2025-10-02 10:57:58 +0000 UTC" firstStartedPulling="2025-10-02 10:58:00.172474396 +0000 UTC m=+156.732381977" 
lastFinishedPulling="2025-10-02 10:59:52.658736861 +0000 UTC m=+269.218644442" observedRunningTime="2025-10-02 10:59:53.295274537 +0000 UTC m=+269.855182148" watchObservedRunningTime="2025-10-02 10:59:53.295892176 +0000 UTC m=+269.855799757" Oct 02 10:59:55 crc kubenswrapper[4835]: I1002 10:59:55.252507 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gw42" event={"ID":"0927eb79-581a-448d-abb3-8c785e24274a","Type":"ContainerStarted","Data":"9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17"} Oct 02 10:59:59 crc kubenswrapper[4835]: I1002 10:59:58.999677 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:59:59 crc kubenswrapper[4835]: I1002 10:59:59.000350 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 10:59:59 crc kubenswrapper[4835]: I1002 10:59:59.063100 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9gw42" Oct 02 10:59:59 crc kubenswrapper[4835]: I1002 10:59:59.063173 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9gw42" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.147917 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9gw42" podStartSLOduration=9.093811784 podStartE2EDuration="2m2.147892367s" podCreationTimestamp="2025-10-02 10:57:58 +0000 UTC" firstStartedPulling="2025-10-02 10:58:00.194378042 +0000 UTC m=+156.754285623" lastFinishedPulling="2025-10-02 10:59:53.248458625 +0000 UTC m=+269.808366206" observedRunningTime="2025-10-02 10:59:56.290012683 +0000 UTC m=+272.849920284" watchObservedRunningTime="2025-10-02 11:00:00.147892367 +0000 UTC m=+276.707799948" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.148505 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k"] Oct 02 11:00:00 crc kubenswrapper[4835]: E1002 11:00:00.148751 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="380763b9-fdb6-4b62-a8e0-c775708be101" containerName="collect-profiles" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.148764 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="380763b9-fdb6-4b62-a8e0-c775708be101" containerName="collect-profiles" Oct 02 11:00:00 crc kubenswrapper[4835]: E1002 11:00:00.148782 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4" containerName="pruner" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.148790 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4" containerName="pruner" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.148907 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="380763b9-fdb6-4b62-a8e0-c775708be101" containerName="collect-profiles" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.148919 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6aac55c-74c7-4ba5-8f51-e15fc4d1fbb4" containerName="pruner" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.149377 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.152847 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.152892 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.165206 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k"] Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.262808 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-secret-volume\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.262874 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-config-volume\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.262986 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bscqb\" (UniqueName: \"kubernetes.io/projected/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-kube-api-access-bscqb\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.293054 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qvnb" event={"ID":"038c63fa-b6fc-4725-9f58-e9c3bfe9595d","Type":"ContainerStarted","Data":"8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735"} Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.340723 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9gw42" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.342254 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.369049 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-secret-volume\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.369105 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-config-volume\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.369161 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bscqb\" (UniqueName: \"kubernetes.io/projected/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-kube-api-access-bscqb\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.372757 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-config-volume\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.377668 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-secret-volume\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.398140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bscqb\" (UniqueName: \"kubernetes.io/projected/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-kube-api-access-bscqb\") pod \"collect-profiles-29323380-87m6k\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.404346 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9gw42" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.417605 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 11:00:00 crc kubenswrapper[4835]: I1002 11:00:00.469657 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:01 crc kubenswrapper[4835]: I1002 11:00:01.326490 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8qvnb" podStartSLOduration=5.725664977 podStartE2EDuration="2m3.326459597s" podCreationTimestamp="2025-10-02 10:57:58 +0000 UTC" firstStartedPulling="2025-10-02 10:58:00.169657132 +0000 UTC m=+156.729564713" lastFinishedPulling="2025-10-02 10:59:57.770451752 +0000 UTC m=+274.330359333" observedRunningTime="2025-10-02 11:00:01.324191778 +0000 UTC m=+277.884099369" watchObservedRunningTime="2025-10-02 11:00:01.326459597 +0000 UTC m=+277.886367178" Oct 02 11:00:02 crc kubenswrapper[4835]: I1002 11:00:02.628590 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gw42"] Oct 02 11:00:02 crc kubenswrapper[4835]: I1002 11:00:02.629387 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9gw42" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="registry-server" containerID="cri-o://9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17" gracePeriod=2 Oct 02 11:00:03 crc kubenswrapper[4835]: E1002 11:00:03.224246 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6clfp" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:04.425122 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p72fs"] Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:04.425759 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p72fs" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="registry-server" containerID="cri-o://5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" gracePeriod=2 Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:05.658629 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hsq2k" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:06.337735 4835 generic.go:334] "Generic (PLEG): container finished" podID="0927eb79-581a-448d-abb3-8c785e24274a" containerID="9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17" exitCode=0 Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:06.337809 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gw42" event={"ID":"0927eb79-581a-448d-abb3-8c785e24274a","Type":"ContainerDied","Data":"9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17"} Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:07.348172 4835 generic.go:334] "Generic (PLEG): container finished" podID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" exitCode=0 Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:07.348256 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-p72fs" event={"ID":"c296c7cb-3e59-473c-af61-8c1bdc9366dc","Type":"ContainerDied","Data":"5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4"} Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:08.536530 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:08.536850 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:08.576923 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.000606 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.001273 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.001738 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.001763 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-p72fs" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="registry-server" Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.063349 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17 is running failed: container process not found" containerID="9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.063938 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17 is running failed: container process not found" containerID="9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.064286 
4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17 is running failed: container process not found" containerID="9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:16 crc kubenswrapper[4835]: E1002 11:00:09.064316 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-9gw42" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="registry-server" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:09.403185 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.803049 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gw42" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.867456 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-utilities\") pod \"0927eb79-581a-448d-abb3-8c785e24274a\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.867528 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-catalog-content\") pod \"0927eb79-581a-448d-abb3-8c785e24274a\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.867625 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk4hl\" (UniqueName: \"kubernetes.io/projected/0927eb79-581a-448d-abb3-8c785e24274a-kube-api-access-zk4hl\") pod \"0927eb79-581a-448d-abb3-8c785e24274a\" (UID: \"0927eb79-581a-448d-abb3-8c785e24274a\") " Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.868630 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-utilities" (OuterVolumeSpecName: "utilities") pod "0927eb79-581a-448d-abb3-8c785e24274a" (UID: "0927eb79-581a-448d-abb3-8c785e24274a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.873655 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0927eb79-581a-448d-abb3-8c785e24274a-kube-api-access-zk4hl" (OuterVolumeSpecName: "kube-api-access-zk4hl") pod "0927eb79-581a-448d-abb3-8c785e24274a" (UID: "0927eb79-581a-448d-abb3-8c785e24274a"). InnerVolumeSpecName "kube-api-access-zk4hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.922912 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0927eb79-581a-448d-abb3-8c785e24274a" (UID: "0927eb79-581a-448d-abb3-8c785e24274a"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.968648 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk4hl\" (UniqueName: \"kubernetes.io/projected/0927eb79-581a-448d-abb3-8c785e24274a-kube-api-access-zk4hl\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.968682 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:13.968696 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0927eb79-581a-448d-abb3-8c785e24274a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:14.391311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gw42" event={"ID":"0927eb79-581a-448d-abb3-8c785e24274a","Type":"ContainerDied","Data":"feb7641ddd845c62154f1d20b19418aafa063fd10ffa2f974b621b52e7834451"} Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:14.391363 4835 scope.go:117] "RemoveContainer" containerID="9c9fc10db2aff6472471ff1be20651ed318534903ea1c3335752d51f2ee05a17" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:14.391365 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gw42" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:14.406992 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gw42"] Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:14.416041 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9gw42"] Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:16.192578 4835 scope.go:117] "RemoveContainer" containerID="c16dd3fbdba4c872b938d009bce06917ddcfc7ba72793a9bf113dcfdeb6a6966" Oct 02 11:00:16 crc kubenswrapper[4835]: I1002 11:00:16.258081 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0927eb79-581a-448d-abb3-8c785e24274a" path="/var/lib/kubelet/pods/0927eb79-581a-448d-abb3-8c785e24274a/volumes" Oct 02 11:00:19 crc kubenswrapper[4835]: E1002 11:00:19.000151 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:19 crc kubenswrapper[4835]: E1002 11:00:19.001349 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:19 crc kubenswrapper[4835]: E1002 11:00:19.001985 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:00:19 crc kubenswrapper[4835]: E1002 11:00:19.002011 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-p72fs" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="registry-server" Oct 02 11:00:21 crc kubenswrapper[4835]: I1002 11:00:21.681054 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k"] Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.214741 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.414347 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-catalog-content\") pod \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.414467 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-utilities\") pod \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.414614 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52bh8\" (UniqueName: \"kubernetes.io/projected/c296c7cb-3e59-473c-af61-8c1bdc9366dc-kube-api-access-52bh8\") pod \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\" (UID: \"c296c7cb-3e59-473c-af61-8c1bdc9366dc\") " Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.415171 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-utilities" (OuterVolumeSpecName: "utilities") pod "c296c7cb-3e59-473c-af61-8c1bdc9366dc" (UID: "c296c7cb-3e59-473c-af61-8c1bdc9366dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.421565 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c296c7cb-3e59-473c-af61-8c1bdc9366dc-kube-api-access-52bh8" (OuterVolumeSpecName: "kube-api-access-52bh8") pod "c296c7cb-3e59-473c-af61-8c1bdc9366dc" (UID: "c296c7cb-3e59-473c-af61-8c1bdc9366dc"). InnerVolumeSpecName "kube-api-access-52bh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.451953 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p72fs" event={"ID":"c296c7cb-3e59-473c-af61-8c1bdc9366dc","Type":"ContainerDied","Data":"27d496e9ee0f5481c60ec8658808663254ca9abb462034ee2c2084f05b0139df"} Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.452039 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p72fs" Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.452834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" event={"ID":"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe","Type":"ContainerStarted","Data":"57610f20c88d2abce33b097cedb0b8fb8a9e2d386e0bcd07aee87fc5b3687f71"} Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.516543 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52bh8\" (UniqueName: \"kubernetes.io/projected/c296c7cb-3e59-473c-af61-8c1bdc9366dc-kube-api-access-52bh8\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.516595 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:23 crc kubenswrapper[4835]: I1002 11:00:23.962491 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c296c7cb-3e59-473c-af61-8c1bdc9366dc" (UID: "c296c7cb-3e59-473c-af61-8c1bdc9366dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.024192 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c296c7cb-3e59-473c-af61-8c1bdc9366dc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.111146 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p72fs"] Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.125969 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p72fs"] Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.258963 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" path="/var/lib/kubelet/pods/c296c7cb-3e59-473c-af61-8c1bdc9366dc/volumes" Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.281883 4835 scope.go:117] "RemoveContainer" containerID="12901a1f9c85ea6eab2198dabc18084e62ca6e36602e346612080cb79dc0c196" Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.506059 4835 scope.go:117] "RemoveContainer" containerID="5681246a1ad2e69ec118ffc88dfaabd2036b0b23b2e77fec4b55a02e72ff2bc4" Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.539977 4835 scope.go:117] "RemoveContainer" containerID="8538ae54a15dc6a9b1b6c24ad2eca35a6382a241f4e42d8f2ac27bf10cda6ddc" Oct 02 11:00:24 crc kubenswrapper[4835]: I1002 11:00:24.597914 4835 scope.go:117] "RemoveContainer" containerID="51747b90f271788965e22bf2f4ed6fb17f16eec5188caf44ae8c885d7acb7ca3" Oct 02 11:00:26 crc kubenswrapper[4835]: I1002 11:00:26.526930 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdkv" event={"ID":"2557fb3c-0a66-47fb-95b2-64e41d22a740","Type":"ContainerStarted","Data":"f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61"} Oct 02 11:00:27 crc kubenswrapper[4835]: I1002 11:00:27.533292 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbwpk" 
event={"ID":"6cbd9159-5af9-4954-a9d6-29e3abb32763","Type":"ContainerStarted","Data":"3b1a9622e7f0f1a79d9f9f9c6f90774f4deb9e801138c812d2268c709f7782ce"} Oct 02 11:00:27 crc kubenswrapper[4835]: I1002 11:00:27.534500 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" event={"ID":"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe","Type":"ContainerStarted","Data":"3c035763cfda7c9e4888012d3918bb05d86aa1ed21793a182fb3b06493a03a9a"} Oct 02 11:00:27 crc kubenswrapper[4835]: I1002 11:00:27.536662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdd7" event={"ID":"9ce2263d-d93b-48cb-b9a5-ec10967c9730","Type":"ContainerStarted","Data":"5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1"} Oct 02 11:00:27 crc kubenswrapper[4835]: I1002 11:00:27.554863 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nqdkv" podStartSLOduration=5.4545000550000005 podStartE2EDuration="2m29.554844629s" podCreationTimestamp="2025-10-02 10:57:58 +0000 UTC" firstStartedPulling="2025-10-02 10:58:00.182199368 +0000 UTC m=+156.742106949" lastFinishedPulling="2025-10-02 11:00:24.282543942 +0000 UTC m=+300.842451523" observedRunningTime="2025-10-02 11:00:27.551852228 +0000 UTC m=+304.111759819" watchObservedRunningTime="2025-10-02 11:00:27.554844629 +0000 UTC m=+304.114752210" Oct 02 11:00:28 crc kubenswrapper[4835]: I1002 11:00:28.543911 4835 generic.go:334] "Generic (PLEG): container finished" podID="b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" containerID="3c035763cfda7c9e4888012d3918bb05d86aa1ed21793a182fb3b06493a03a9a" exitCode=0 Oct 02 11:00:28 crc kubenswrapper[4835]: I1002 11:00:28.543955 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" event={"ID":"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe","Type":"ContainerDied","Data":"3c035763cfda7c9e4888012d3918bb05d86aa1ed21793a182fb3b06493a03a9a"} Oct 02 11:00:28 crc kubenswrapper[4835]: I1002 11:00:28.680019 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 11:00:28 crc kubenswrapper[4835]: I1002 11:00:28.680080 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 11:00:29 crc kubenswrapper[4835]: I1002 11:00:29.567138 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fdd7" podStartSLOduration=8.794899082 podStartE2EDuration="2m28.567114663s" podCreationTimestamp="2025-10-02 10:58:01 +0000 UTC" firstStartedPulling="2025-10-02 10:58:03.383384431 +0000 UTC m=+159.943292012" lastFinishedPulling="2025-10-02 11:00:23.155599992 +0000 UTC m=+299.715507593" observedRunningTime="2025-10-02 11:00:29.565425431 +0000 UTC m=+306.125333012" watchObservedRunningTime="2025-10-02 11:00:29.567114663 +0000 UTC m=+306.127022264" Oct 02 11:00:29 crc kubenswrapper[4835]: I1002 11:00:29.609587 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbwpk" podStartSLOduration=18.256580075 podStartE2EDuration="2m28.609566482s" podCreationTimestamp="2025-10-02 10:58:01 +0000 UTC" firstStartedPulling="2025-10-02 10:58:03.399575726 +0000 UTC m=+159.959483307" lastFinishedPulling="2025-10-02 11:00:13.752562113 +0000 UTC m=+290.312469714" 
observedRunningTime="2025-10-02 11:00:29.607056066 +0000 UTC m=+306.166963667" watchObservedRunningTime="2025-10-02 11:00:29.609566482 +0000 UTC m=+306.169474063" Oct 02 11:00:29 crc kubenswrapper[4835]: I1002 11:00:29.730427 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nqdkv" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="registry-server" probeResult="failure" output=< Oct 02 11:00:29 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 02 11:00:29 crc kubenswrapper[4835]: > Oct 02 11:00:30 crc kubenswrapper[4835]: I1002 11:00:30.813560 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:30 crc kubenswrapper[4835]: I1002 11:00:30.918023 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-config-volume\") pod \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " Oct 02 11:00:30 crc kubenswrapper[4835]: I1002 11:00:30.918155 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bscqb\" (UniqueName: \"kubernetes.io/projected/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-kube-api-access-bscqb\") pod \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " Oct 02 11:00:30 crc kubenswrapper[4835]: I1002 11:00:30.918183 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-secret-volume\") pod \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\" (UID: \"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe\") " Oct 02 11:00:30 crc kubenswrapper[4835]: I1002 11:00:30.918917 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" (UID: "b4c401d6-7b1d-40f4-8570-d0ccb5b778fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:00:30 crc kubenswrapper[4835]: I1002 11:00:30.924269 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" (UID: "b4c401d6-7b1d-40f4-8570-d0ccb5b778fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:00:30 crc kubenswrapper[4835]: I1002 11:00:30.924380 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-kube-api-access-bscqb" (OuterVolumeSpecName: "kube-api-access-bscqb") pod "b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" (UID: "b4c401d6-7b1d-40f4-8570-d0ccb5b778fe"). InnerVolumeSpecName "kube-api-access-bscqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.020286 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.020874 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bscqb\" (UniqueName: \"kubernetes.io/projected/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-kube-api-access-bscqb\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.020892 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.563216 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" event={"ID":"b4c401d6-7b1d-40f4-8570-d0ccb5b778fe","Type":"ContainerDied","Data":"57610f20c88d2abce33b097cedb0b8fb8a9e2d386e0bcd07aee87fc5b3687f71"} Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.563286 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57610f20c88d2abce33b097cedb0b8fb8a9e2d386e0bcd07aee87fc5b3687f71" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.563323 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.819209 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.820259 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 11:00:31 crc kubenswrapper[4835]: I1002 11:00:31.873471 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 11:00:32 crc kubenswrapper[4835]: I1002 11:00:32.066032 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 11:00:32 crc kubenswrapper[4835]: I1002 11:00:32.066092 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 11:00:32 crc kubenswrapper[4835]: I1002 11:00:32.102931 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 11:00:32 crc kubenswrapper[4835]: I1002 11:00:32.608622 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 11:00:32 crc kubenswrapper[4835]: I1002 11:00:32.615630 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 11:00:33 crc kubenswrapper[4835]: I1002 11:00:33.385988 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbwpk"] Oct 02 11:00:34 crc kubenswrapper[4835]: I1002 11:00:34.585160 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbwpk" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" 
containerName="registry-server" containerID="cri-o://3b1a9622e7f0f1a79d9f9f9c6f90774f4deb9e801138c812d2268c709f7782ce" gracePeriod=2 Oct 02 11:00:35 crc kubenswrapper[4835]: I1002 11:00:35.594104 4835 generic.go:334] "Generic (PLEG): container finished" podID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerID="3b1a9622e7f0f1a79d9f9f9c6f90774f4deb9e801138c812d2268c709f7782ce" exitCode=0 Oct 02 11:00:35 crc kubenswrapper[4835]: I1002 11:00:35.594170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbwpk" event={"ID":"6cbd9159-5af9-4954-a9d6-29e3abb32763","Type":"ContainerDied","Data":"3b1a9622e7f0f1a79d9f9f9c6f90774f4deb9e801138c812d2268c709f7782ce"} Oct 02 11:00:35 crc kubenswrapper[4835]: I1002 11:00:35.878694 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.009473 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-catalog-content\") pod \"6cbd9159-5af9-4954-a9d6-29e3abb32763\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.009539 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-utilities\") pod \"6cbd9159-5af9-4954-a9d6-29e3abb32763\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.009595 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jgn6\" (UniqueName: \"kubernetes.io/projected/6cbd9159-5af9-4954-a9d6-29e3abb32763-kube-api-access-7jgn6\") pod \"6cbd9159-5af9-4954-a9d6-29e3abb32763\" (UID: \"6cbd9159-5af9-4954-a9d6-29e3abb32763\") " Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.011619 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-utilities" (OuterVolumeSpecName: "utilities") pod "6cbd9159-5af9-4954-a9d6-29e3abb32763" (UID: "6cbd9159-5af9-4954-a9d6-29e3abb32763"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.018549 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbd9159-5af9-4954-a9d6-29e3abb32763-kube-api-access-7jgn6" (OuterVolumeSpecName: "kube-api-access-7jgn6") pod "6cbd9159-5af9-4954-a9d6-29e3abb32763" (UID: "6cbd9159-5af9-4954-a9d6-29e3abb32763"). InnerVolumeSpecName "kube-api-access-7jgn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.090067 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cbd9159-5af9-4954-a9d6-29e3abb32763" (UID: "6cbd9159-5af9-4954-a9d6-29e3abb32763"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.111730 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.111827 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbd9159-5af9-4954-a9d6-29e3abb32763-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.111839 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jgn6\" (UniqueName: \"kubernetes.io/projected/6cbd9159-5af9-4954-a9d6-29e3abb32763-kube-api-access-7jgn6\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.605812 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbwpk" event={"ID":"6cbd9159-5af9-4954-a9d6-29e3abb32763","Type":"ContainerDied","Data":"9659b2476b786603e82cfd265ff653e2b9873223d1839219b641dc6efc8017f4"} Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.606212 4835 scope.go:117] "RemoveContainer" containerID="3b1a9622e7f0f1a79d9f9f9c6f90774f4deb9e801138c812d2268c709f7782ce" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.605960 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbwpk" Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.625567 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbwpk"] Oct 02 11:00:36 crc kubenswrapper[4835]: I1002 11:00:36.629350 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbwpk"] Oct 02 11:00:37 crc kubenswrapper[4835]: I1002 11:00:37.442137 4835 scope.go:117] "RemoveContainer" containerID="d62af045b2fc32decc288cdfbf975419574eb0f3580eff451bfa8db05f722acb" Oct 02 11:00:37 crc kubenswrapper[4835]: I1002 11:00:37.471423 4835 scope.go:117] "RemoveContainer" containerID="e4054a622a1b04cfea34b409255b7e5cb26bfbd3bf019a4834acb04179b4a270" Oct 02 11:00:38 crc kubenswrapper[4835]: I1002 11:00:38.262845 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" path="/var/lib/kubelet/pods/6cbd9159-5af9-4954-a9d6-29e3abb32763/volumes" Oct 02 11:00:38 crc kubenswrapper[4835]: I1002 11:00:38.621533 4835 generic.go:334] "Generic (PLEG): container finished" podID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerID="b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080" exitCode=0 Oct 02 11:00:38 crc kubenswrapper[4835]: I1002 11:00:38.621631 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hsq2k" event={"ID":"db1c8609-9e4d-49d2-9ffe-15075be7ada6","Type":"ContainerDied","Data":"b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080"} Oct 02 11:00:38 crc kubenswrapper[4835]: I1002 11:00:38.625992 4835 generic.go:334] "Generic (PLEG): container finished" podID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerID="8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773" exitCode=0 Oct 02 11:00:38 crc kubenswrapper[4835]: I1002 11:00:38.626065 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6clfp" 
event={"ID":"33d53270-0571-4a70-8b20-3b3f181e1a5c","Type":"ContainerDied","Data":"8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773"} Oct 02 11:00:38 crc kubenswrapper[4835]: I1002 11:00:38.720278 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 11:00:38 crc kubenswrapper[4835]: I1002 11:00:38.761980 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 11:00:39 crc kubenswrapper[4835]: I1002 11:00:39.635448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6clfp" event={"ID":"33d53270-0571-4a70-8b20-3b3f181e1a5c","Type":"ContainerStarted","Data":"bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde"} Oct 02 11:00:39 crc kubenswrapper[4835]: I1002 11:00:39.638908 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hsq2k" event={"ID":"db1c8609-9e4d-49d2-9ffe-15075be7ada6","Type":"ContainerStarted","Data":"3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52"} Oct 02 11:00:39 crc kubenswrapper[4835]: I1002 11:00:39.655558 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6clfp" podStartSLOduration=2.753049174 podStartE2EDuration="2m39.655542361s" podCreationTimestamp="2025-10-02 10:58:00 +0000 UTC" firstStartedPulling="2025-10-02 10:58:02.376596979 +0000 UTC m=+158.936504560" lastFinishedPulling="2025-10-02 11:00:39.279090146 +0000 UTC m=+315.838997747" observedRunningTime="2025-10-02 11:00:39.652838969 +0000 UTC m=+316.212746550" watchObservedRunningTime="2025-10-02 11:00:39.655542361 +0000 UTC m=+316.215449942" Oct 02 11:00:39 crc kubenswrapper[4835]: I1002 11:00:39.676006 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hsq2k" podStartSLOduration=2.6089056409999998 podStartE2EDuration="2m39.675978082s" podCreationTimestamp="2025-10-02 10:58:00 +0000 UTC" firstStartedPulling="2025-10-02 10:58:02.336837867 +0000 UTC m=+158.896745448" lastFinishedPulling="2025-10-02 11:00:39.403910308 +0000 UTC m=+315.963817889" observedRunningTime="2025-10-02 11:00:39.675424575 +0000 UTC m=+316.235332176" watchObservedRunningTime="2025-10-02 11:00:39.675978082 +0000 UTC m=+316.235885673" Oct 02 11:00:40 crc kubenswrapper[4835]: I1002 11:00:40.661248 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 11:00:40 crc kubenswrapper[4835]: I1002 11:00:40.661600 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 11:00:40 crc kubenswrapper[4835]: I1002 11:00:40.709886 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 11:00:41 crc kubenswrapper[4835]: I1002 11:00:41.066869 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 11:00:41 crc kubenswrapper[4835]: I1002 11:00:41.067016 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 11:00:41 crc kubenswrapper[4835]: I1002 11:00:41.111799 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 11:00:50 crc kubenswrapper[4835]: I1002 11:00:50.721077 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 11:00:51 crc kubenswrapper[4835]: I1002 11:00:51.125147 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 11:00:51 crc kubenswrapper[4835]: I1002 11:00:51.172657 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hsq2k"] Oct 02 11:00:51 crc kubenswrapper[4835]: I1002 11:00:51.248946 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm4nm"] Oct 02 11:00:51 crc kubenswrapper[4835]: I1002 11:00:51.719521 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hsq2k" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="registry-server" containerID="cri-o://3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52" gracePeriod=2 Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.103311 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.239993 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-utilities\") pod \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.240118 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjn7\" (UniqueName: \"kubernetes.io/projected/db1c8609-9e4d-49d2-9ffe-15075be7ada6-kube-api-access-sqjn7\") pod \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.240246 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-catalog-content\") pod \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\" (UID: \"db1c8609-9e4d-49d2-9ffe-15075be7ada6\") " Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.240969 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-utilities" (OuterVolumeSpecName: "utilities") pod "db1c8609-9e4d-49d2-9ffe-15075be7ada6" (UID: "db1c8609-9e4d-49d2-9ffe-15075be7ada6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.248716 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1c8609-9e4d-49d2-9ffe-15075be7ada6-kube-api-access-sqjn7" (OuterVolumeSpecName: "kube-api-access-sqjn7") pod "db1c8609-9e4d-49d2-9ffe-15075be7ada6" (UID: "db1c8609-9e4d-49d2-9ffe-15075be7ada6"). InnerVolumeSpecName "kube-api-access-sqjn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.265608 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db1c8609-9e4d-49d2-9ffe-15075be7ada6" (UID: "db1c8609-9e4d-49d2-9ffe-15075be7ada6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.342650 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.342701 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjn7\" (UniqueName: \"kubernetes.io/projected/db1c8609-9e4d-49d2-9ffe-15075be7ada6-kube-api-access-sqjn7\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.342717 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1c8609-9e4d-49d2-9ffe-15075be7ada6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.727617 4835 generic.go:334] "Generic (PLEG): container finished" podID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerID="3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52" exitCode=0 Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.727664 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hsq2k" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.727667 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hsq2k" event={"ID":"db1c8609-9e4d-49d2-9ffe-15075be7ada6","Type":"ContainerDied","Data":"3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52"} Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.727867 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hsq2k" event={"ID":"db1c8609-9e4d-49d2-9ffe-15075be7ada6","Type":"ContainerDied","Data":"bd16cbd381c0d8150b9ba4fbb56b91156681485bd04523a4ce96f2340c969d40"} Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.727986 4835 scope.go:117] "RemoveContainer" containerID="3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.751144 4835 scope.go:117] "RemoveContainer" containerID="b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.765467 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hsq2k"] Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.768342 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hsq2k"] Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.793514 4835 scope.go:117] "RemoveContainer" containerID="2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.818546 4835 scope.go:117] "RemoveContainer" containerID="3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52" Oct 02 11:00:52 crc kubenswrapper[4835]: E1002 11:00:52.819014 4835 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52\": container with ID starting with 3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52 not found: ID does not exist" containerID="3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.819049 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52"} err="failed to get container status \"3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52\": rpc error: code = NotFound desc = could not find container \"3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52\": container with ID starting with 3b4ea34684c80a3abc079f52d5529031bce1f81583fc52ec67fa4f05a5d56d52 not found: ID does not exist" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.819074 4835 scope.go:117] "RemoveContainer" containerID="b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080" Oct 02 11:00:52 crc kubenswrapper[4835]: E1002 11:00:52.819360 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080\": container with ID starting with b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080 not found: ID does not exist" containerID="b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.819390 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080"} err="failed to get container status \"b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080\": rpc error: code = NotFound desc = could not find container \"b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080\": container with ID starting with b1762d44cdb5d9826fb174f977cc29fcd4fe4bfb7e9839432441d68f0ff7c080 not found: ID does not exist" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.819404 4835 scope.go:117] "RemoveContainer" containerID="2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a" Oct 02 11:00:52 crc kubenswrapper[4835]: E1002 11:00:52.819669 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a\": container with ID starting with 2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a not found: ID does not exist" containerID="2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a" Oct 02 11:00:52 crc kubenswrapper[4835]: I1002 11:00:52.819697 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a"} err="failed to get container status \"2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a\": rpc error: code = NotFound desc = could not find container \"2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a\": container with ID starting with 2c74537f2f7f983a77be0c1cfba9e4697bfa9b5bd58dd46f3c76c2ed3a94163a not found: ID does not exist" Oct 02 11:00:54 crc kubenswrapper[4835]: I1002 11:00:54.258442 4835 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" path="/var/lib/kubelet/pods/db1c8609-9e4d-49d2-9ffe-15075be7ada6/volumes" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.281204 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" podUID="3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" containerName="oauth-openshift" containerID="cri-o://c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86" gracePeriod=15 Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.676832 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723402 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bbb7d99d8-s557k"] Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723829 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723857 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723866 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723872 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723889 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723895 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723906 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" containerName="oauth-openshift" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723916 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" containerName="oauth-openshift" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723939 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723945 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723955 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723960 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723971 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: 
I1002 11:01:16.723977 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.723985 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" containerName="collect-profiles" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.723992 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" containerName="collect-profiles" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.724001 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724007 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.724016 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724022 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.724032 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724039 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.724049 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724055 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="extract-utilities" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.724131 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724138 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="extract-content" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.724147 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724152 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724382 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" containerName="collect-profiles" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724404 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c296c7cb-3e59-473c-af61-8c1bdc9366dc" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724416 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1c8609-9e4d-49d2-9ffe-15075be7ada6" containerName="registry-server" 
Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724425 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbd9159-5af9-4954-a9d6-29e3abb32763" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724483 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" containerName="oauth-openshift" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.724496 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0927eb79-581a-448d-abb3-8c785e24274a" containerName="registry-server" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.725999 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.727066 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bbb7d99d8-s557k"] Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799638 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-service-ca\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799692 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-error\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799724 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b5rc\" (UniqueName: \"kubernetes.io/projected/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-kube-api-access-8b5rc\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799772 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-session\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799795 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-cliconfig\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799813 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-dir\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799847 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-serving-cert\") pod 
\"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799868 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-ocp-branding-template\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799885 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-idp-0-file-data\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799949 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-router-certs\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.799988 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-login\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800014 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-provider-selection\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800040 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-policies\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800058 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-trusted-ca-bundle\") pod \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\" (UID: \"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487\") " Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800189 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-service-ca\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800211 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j4bl\" (UniqueName: 
\"kubernetes.io/projected/35116f1b-0339-4cd2-ab15-263ba455920c-kube-api-access-5j4bl\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800251 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-session\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800288 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-error\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800316 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800345 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-audit-policies\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-login\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800387 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35116f1b-0339-4cd2-ab15-263ba455920c-audit-dir\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800431 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800448 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800465 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-router-certs\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800507 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800577 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.800895 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.801253 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.801896 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.802395 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.812505 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.812828 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-kube-api-access-8b5rc" (OuterVolumeSpecName: "kube-api-access-8b5rc") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "kube-api-access-8b5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.813639 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.814287 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.814651 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.814847 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.815681 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.815747 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.816088 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" (UID: "3c7e4e66-b9ef-43fb-a1b9-539bba8ec487"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.897907 4835 generic.go:334] "Generic (PLEG): container finished" podID="3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" containerID="c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86" exitCode=0 Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.898006 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.898029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" event={"ID":"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487","Type":"ContainerDied","Data":"c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86"} Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.898149 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm4nm" event={"ID":"3c7e4e66-b9ef-43fb-a1b9-539bba8ec487","Type":"ContainerDied","Data":"f8ed007ad497e0723d7f590c76360381a05b7ecb7cffb9cea0d7966f7fe6567a"} Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.898195 4835 scope.go:117] "RemoveContainer" containerID="c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902382 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902486 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-service-ca\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902514 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j4bl\" (UniqueName: \"kubernetes.io/projected/35116f1b-0339-4cd2-ab15-263ba455920c-kube-api-access-5j4bl\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902533 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-session\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902570 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902593 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-error\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 
11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902744 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-audit-policies\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902778 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-login\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902796 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35116f1b-0339-4cd2-ab15-263ba455920c-audit-dir\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902837 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902866 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.902908 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.903836 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35116f1b-0339-4cd2-ab15-263ba455920c-audit-dir\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905643 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-router-certs\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905731 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905746 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905757 4835 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905767 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905797 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905809 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905819 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905833 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905844 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905917 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905927 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905939 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905967 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905977 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b5rc\" (UniqueName: \"kubernetes.io/projected/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487-kube-api-access-8b5rc\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.905997 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.906803 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-service-ca\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.906871 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.906876 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35116f1b-0339-4cd2-ab15-263ba455920c-audit-policies\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.912185 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-session\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.913758 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-login\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " 
pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.913924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.916329 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.916935 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-template-error\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.918432 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.918938 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-system-router-certs\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.923925 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35116f1b-0339-4cd2-ab15-263ba455920c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.925951 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j4bl\" (UniqueName: \"kubernetes.io/projected/35116f1b-0339-4cd2-ab15-263ba455920c-kube-api-access-5j4bl\") pod \"oauth-openshift-bbb7d99d8-s557k\" (UID: \"35116f1b-0339-4cd2-ab15-263ba455920c\") " pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.937742 4835 scope.go:117] "RemoveContainer" containerID="c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86" Oct 02 11:01:16 crc kubenswrapper[4835]: E1002 11:01:16.938661 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86\": container with ID starting with c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86 not found: ID does not exist" containerID="c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.938748 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86"} err="failed to get container status \"c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86\": rpc error: code = NotFound desc = could not find container \"c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86\": container with ID starting with c6e721589c6019f584d20dfeb28a837196a3fb675d2e6d9289412a392e3bce86 not found: ID does not exist" Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.946854 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm4nm"] Oct 02 11:01:16 crc kubenswrapper[4835]: I1002 11:01:16.952575 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm4nm"] Oct 02 11:01:17 crc kubenswrapper[4835]: I1002 11:01:17.042194 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:17 crc kubenswrapper[4835]: I1002 11:01:17.284682 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bbb7d99d8-s557k"] Oct 02 11:01:17 crc kubenswrapper[4835]: I1002 11:01:17.906627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" event={"ID":"35116f1b-0339-4cd2-ab15-263ba455920c","Type":"ContainerStarted","Data":"cc2e3d43cef5e7df149dacfb3a100afdd2b308b7ef60f7ae59f95fd2898c34cd"} Oct 02 11:01:17 crc kubenswrapper[4835]: I1002 11:01:17.907164 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" event={"ID":"35116f1b-0339-4cd2-ab15-263ba455920c","Type":"ContainerStarted","Data":"7a7195eea022ccbe97bd6014d916d202846f7dd2ea870c743d6442e5ef32a599"} Oct 02 11:01:17 crc kubenswrapper[4835]: I1002 11:01:17.907192 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:17 crc kubenswrapper[4835]: I1002 11:01:17.933004 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" podStartSLOduration=26.932964385 podStartE2EDuration="26.932964385s" podCreationTimestamp="2025-10-02 11:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:01:17.930904085 +0000 UTC m=+354.490811666" watchObservedRunningTime="2025-10-02 11:01:17.932964385 +0000 UTC m=+354.492871996" Oct 02 11:01:18 crc kubenswrapper[4835]: I1002 11:01:18.096950 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bbb7d99d8-s557k" Oct 02 11:01:18 crc kubenswrapper[4835]: I1002 11:01:18.260624 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c7e4e66-b9ef-43fb-a1b9-539bba8ec487" path="/var/lib/kubelet/pods/3c7e4e66-b9ef-43fb-a1b9-539bba8ec487/volumes" Oct 02 11:01:38 crc 
kubenswrapper[4835]: I1002 11:01:38.371362 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qvnb"] Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.373079 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8qvnb" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="registry-server" containerID="cri-o://8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" gracePeriod=30 Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.386618 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqdkv"] Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.387053 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nqdkv" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="registry-server" containerID="cri-o://f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" gracePeriod=30 Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.434545 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-grwx8"] Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.435732 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerName="marketplace-operator" containerID="cri-o://13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7" gracePeriod=30 Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.445119 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6clfp"] Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.446049 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6clfp" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="registry-server" containerID="cri-o://bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde" gracePeriod=30 Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.452789 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdd7"] Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.453299 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fdd7" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerName="registry-server" containerID="cri-o://5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1" gracePeriod=30 Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.458587 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-25xg8"] Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.467370 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.472121 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-25xg8"] Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.540780 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735 is running failed: container process not found" containerID="8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.541384 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735 is running failed: container process not found" containerID="8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.543553 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735 is running failed: container process not found" containerID="8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.543649 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-8qvnb" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="registry-server" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.550125 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbd4w\" (UniqueName: \"kubernetes.io/projected/a38861d2-5ab5-49ec-ac3e-1980fd30757a-kube-api-access-hbd4w\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.550187 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a38861d2-5ab5-49ec-ac3e-1980fd30757a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.550222 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a38861d2-5ab5-49ec-ac3e-1980fd30757a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.652082 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hbd4w\" (UniqueName: \"kubernetes.io/projected/a38861d2-5ab5-49ec-ac3e-1980fd30757a-kube-api-access-hbd4w\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.652144 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a38861d2-5ab5-49ec-ac3e-1980fd30757a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.652176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a38861d2-5ab5-49ec-ac3e-1980fd30757a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.656866 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a38861d2-5ab5-49ec-ac3e-1980fd30757a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.661793 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a38861d2-5ab5-49ec-ac3e-1980fd30757a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.677767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbd4w\" (UniqueName: \"kubernetes.io/projected/a38861d2-5ab5-49ec-ac3e-1980fd30757a-kube-api-access-hbd4w\") pod \"marketplace-operator-79b997595-25xg8\" (UID: \"a38861d2-5ab5-49ec-ac3e-1980fd30757a\") " pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.681419 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61 is running failed: container process not found" containerID="f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.683066 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61 is running failed: container process not found" containerID="f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.684870 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61 is running failed: container process not found" containerID="f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 11:01:38 crc kubenswrapper[4835]: E1002 11:01:38.684979 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-nqdkv" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="registry-server" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.841216 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.848585 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.856594 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.871759 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.902441 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.925904 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.958723 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-catalog-content\") pod \"33d53270-0571-4a70-8b20-3b3f181e1a5c\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.959713 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-utilities\") pod \"33d53270-0571-4a70-8b20-3b3f181e1a5c\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.959890 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-catalog-content\") pod \"2557fb3c-0a66-47fb-95b2-64e41d22a740\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.959933 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-utilities\") pod \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.959988 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt2kj\" (UniqueName: \"kubernetes.io/projected/2557fb3c-0a66-47fb-95b2-64e41d22a740-kube-api-access-pt2kj\") pod \"2557fb3c-0a66-47fb-95b2-64e41d22a740\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.960028 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkll7\" (UniqueName: \"kubernetes.io/projected/33d53270-0571-4a70-8b20-3b3f181e1a5c-kube-api-access-tkll7\") pod \"33d53270-0571-4a70-8b20-3b3f181e1a5c\" (UID: \"33d53270-0571-4a70-8b20-3b3f181e1a5c\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.960073 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-utilities\") pod \"2557fb3c-0a66-47fb-95b2-64e41d22a740\" (UID: \"2557fb3c-0a66-47fb-95b2-64e41d22a740\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.960092 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-catalog-content\") pod \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.960117 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhbxv\" (UniqueName: \"kubernetes.io/projected/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-kube-api-access-lhbxv\") pod \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\" (UID: \"038c63fa-b6fc-4725-9f58-e9c3bfe9595d\") " Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.961825 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-utilities" (OuterVolumeSpecName: "utilities") pod 
"038c63fa-b6fc-4725-9f58-e9c3bfe9595d" (UID: "038c63fa-b6fc-4725-9f58-e9c3bfe9595d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.963756 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-utilities" (OuterVolumeSpecName: "utilities") pod "2557fb3c-0a66-47fb-95b2-64e41d22a740" (UID: "2557fb3c-0a66-47fb-95b2-64e41d22a740"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.964734 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d53270-0571-4a70-8b20-3b3f181e1a5c-kube-api-access-tkll7" (OuterVolumeSpecName: "kube-api-access-tkll7") pod "33d53270-0571-4a70-8b20-3b3f181e1a5c" (UID: "33d53270-0571-4a70-8b20-3b3f181e1a5c"). InnerVolumeSpecName "kube-api-access-tkll7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.970119 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-kube-api-access-lhbxv" (OuterVolumeSpecName: "kube-api-access-lhbxv") pod "038c63fa-b6fc-4725-9f58-e9c3bfe9595d" (UID: "038c63fa-b6fc-4725-9f58-e9c3bfe9595d"). InnerVolumeSpecName "kube-api-access-lhbxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.978611 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2557fb3c-0a66-47fb-95b2-64e41d22a740-kube-api-access-pt2kj" (OuterVolumeSpecName: "kube-api-access-pt2kj") pod "2557fb3c-0a66-47fb-95b2-64e41d22a740" (UID: "2557fb3c-0a66-47fb-95b2-64e41d22a740"). InnerVolumeSpecName "kube-api-access-pt2kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.986679 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33d53270-0571-4a70-8b20-3b3f181e1a5c" (UID: "33d53270-0571-4a70-8b20-3b3f181e1a5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:38 crc kubenswrapper[4835]: I1002 11:01:38.992323 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-utilities" (OuterVolumeSpecName: "utilities") pod "33d53270-0571-4a70-8b20-3b3f181e1a5c" (UID: "33d53270-0571-4a70-8b20-3b3f181e1a5c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.058443 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerID="5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1" exitCode=0 Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.058517 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdd7" event={"ID":"9ce2263d-d93b-48cb-b9a5-ec10967c9730","Type":"ContainerDied","Data":"5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.058551 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdd7" event={"ID":"9ce2263d-d93b-48cb-b9a5-ec10967c9730","Type":"ContainerDied","Data":"3796abc5c3b875b41c13a0f5dea8e8b6898ac1dceddb8a60ec0976a6d8a2b439"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.058569 4835 scope.go:117] "RemoveContainer" containerID="5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.058704 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdd7" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.062802 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-catalog-content\") pod \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.062887 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-operator-metrics\") pod \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.062946 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-utilities\") pod \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.062979 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-trusted-ca\") pod \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.063004 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f8gd\" (UniqueName: \"kubernetes.io/projected/9ce2263d-d93b-48cb-b9a5-ec10967c9730-kube-api-access-7f8gd\") pod \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\" (UID: \"9ce2263d-d93b-48cb-b9a5-ec10967c9730\") " Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.063809 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kflk9\" (UniqueName: \"kubernetes.io/projected/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-kube-api-access-kflk9\") pod \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\" (UID: \"c1ab4193-73f3-4a23-a134-ca28f61c7eb0\") " Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 
11:01:39.064745 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-utilities" (OuterVolumeSpecName: "utilities") pod "9ce2263d-d93b-48cb-b9a5-ec10967c9730" (UID: "9ce2263d-d93b-48cb-b9a5-ec10967c9730"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.065828 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c1ab4193-73f3-4a23-a134-ca28f61c7eb0" (UID: "c1ab4193-73f3-4a23-a134-ca28f61c7eb0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.066835 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce2263d-d93b-48cb-b9a5-ec10967c9730-kube-api-access-7f8gd" (OuterVolumeSpecName: "kube-api-access-7f8gd") pod "9ce2263d-d93b-48cb-b9a5-ec10967c9730" (UID: "9ce2263d-d93b-48cb-b9a5-ec10967c9730"). InnerVolumeSpecName "kube-api-access-7f8gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067594 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067623 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt2kj\" (UniqueName: \"kubernetes.io/projected/2557fb3c-0a66-47fb-95b2-64e41d22a740-kube-api-access-pt2kj\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067710 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkll7\" (UniqueName: \"kubernetes.io/projected/33d53270-0571-4a70-8b20-3b3f181e1a5c-kube-api-access-tkll7\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067722 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067732 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067771 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f8gd\" (UniqueName: \"kubernetes.io/projected/9ce2263d-d93b-48cb-b9a5-ec10967c9730-kube-api-access-7f8gd\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067781 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067806 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhbxv\" (UniqueName: \"kubernetes.io/projected/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-kube-api-access-lhbxv\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 
11:01:39.067816 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.067826 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d53270-0571-4a70-8b20-3b3f181e1a5c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.069606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-kube-api-access-kflk9" (OuterVolumeSpecName: "kube-api-access-kflk9") pod "c1ab4193-73f3-4a23-a134-ca28f61c7eb0" (UID: "c1ab4193-73f3-4a23-a134-ca28f61c7eb0"). InnerVolumeSpecName "kube-api-access-kflk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.069824 4835 generic.go:334] "Generic (PLEG): container finished" podID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerID="8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" exitCode=0 Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.071272 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qvnb" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.071296 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qvnb" event={"ID":"038c63fa-b6fc-4725-9f58-e9c3bfe9595d","Type":"ContainerDied","Data":"8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.071402 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qvnb" event={"ID":"038c63fa-b6fc-4725-9f58-e9c3bfe9595d","Type":"ContainerDied","Data":"8e5ae7078fe9992bbb43e4852b5c8aedf9377bfa1d598772e1959e645bbae320"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.072051 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c1ab4193-73f3-4a23-a134-ca28f61c7eb0" (UID: "c1ab4193-73f3-4a23-a134-ca28f61c7eb0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.077984 4835 generic.go:334] "Generic (PLEG): container finished" podID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerID="f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" exitCode=0 Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.078046 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdkv" event={"ID":"2557fb3c-0a66-47fb-95b2-64e41d22a740","Type":"ContainerDied","Data":"f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.078075 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdkv" event={"ID":"2557fb3c-0a66-47fb-95b2-64e41d22a740","Type":"ContainerDied","Data":"88eced280bcf4ebd3e7a592b39ceae25c2566664f94b89111f5cfbe351469a5d"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.078148 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqdkv" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.082272 4835 generic.go:334] "Generic (PLEG): container finished" podID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerID="bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde" exitCode=0 Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.082330 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6clfp" event={"ID":"33d53270-0571-4a70-8b20-3b3f181e1a5c","Type":"ContainerDied","Data":"bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.082349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6clfp" event={"ID":"33d53270-0571-4a70-8b20-3b3f181e1a5c","Type":"ContainerDied","Data":"297ed8e37f834102d033a05ccd9b175660f5f1a46a64d681a2b01bf3f612b1d3"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.082409 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6clfp" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.087563 4835 scope.go:117] "RemoveContainer" containerID="385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.087903 4835 generic.go:334] "Generic (PLEG): container finished" podID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerID="13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7" exitCode=0 Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.087959 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" event={"ID":"c1ab4193-73f3-4a23-a134-ca28f61c7eb0","Type":"ContainerDied","Data":"13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.088000 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" event={"ID":"c1ab4193-73f3-4a23-a134-ca28f61c7eb0","Type":"ContainerDied","Data":"673488e99a732a0ec10e3982c5c0c978beb9d8587e354acfefe10c0e67c4f94e"} Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.088079 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-grwx8" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.094983 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2557fb3c-0a66-47fb-95b2-64e41d22a740" (UID: "2557fb3c-0a66-47fb-95b2-64e41d22a740"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.099655 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "038c63fa-b6fc-4725-9f58-e9c3bfe9595d" (UID: "038c63fa-b6fc-4725-9f58-e9c3bfe9595d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.120924 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6clfp"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.125411 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6clfp"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.135354 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-grwx8"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.137904 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-grwx8"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.144973 4835 scope.go:117] "RemoveContainer" containerID="9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.163986 4835 scope.go:117] "RemoveContainer" containerID="5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.164790 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1\": container with ID starting with 5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1 not found: ID does not exist" containerID="5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.164868 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1"} err="failed to get container status \"5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1\": rpc error: code = NotFound desc = could not find container \"5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1\": container with ID starting with 5d39665b252ab1fa2f3b7ed51c558226650397186ae67b8126a3349a3fa3cee1 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.164906 4835 scope.go:117] "RemoveContainer" containerID="385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.165341 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff\": container with ID starting with 385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff not found: ID does not exist" containerID="385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.165392 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff"} err="failed to get container status \"385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff\": rpc error: code = NotFound desc = could not find container \"385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff\": container with ID starting with 385c3fcf044846c093839f145349a479d69d6eb893a866d6ab0c6dcbdaf7b2ff not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.165430 4835 scope.go:117] "RemoveContainer" 
containerID="9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.165674 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f\": container with ID starting with 9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f not found: ID does not exist" containerID="9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.165702 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f"} err="failed to get container status \"9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f\": rpc error: code = NotFound desc = could not find container \"9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f\": container with ID starting with 9f74805427fdc090fac79df212397ee54a0e9bad59628715621385fd19c2b65f not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.165720 4835 scope.go:117] "RemoveContainer" containerID="8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.165973 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ce2263d-d93b-48cb-b9a5-ec10967c9730" (UID: "9ce2263d-d93b-48cb-b9a5-ec10967c9730"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.170022 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038c63fa-b6fc-4725-9f58-e9c3bfe9595d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.170046 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kflk9\" (UniqueName: \"kubernetes.io/projected/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-kube-api-access-kflk9\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.170057 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2557fb3c-0a66-47fb-95b2-64e41d22a740-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.170065 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce2263d-d93b-48cb-b9a5-ec10967c9730-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.170074 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c1ab4193-73f3-4a23-a134-ca28f61c7eb0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.179266 4835 scope.go:117] "RemoveContainer" containerID="246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.199851 4835 scope.go:117] "RemoveContainer" containerID="5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 
11:01:39.216519 4835 scope.go:117] "RemoveContainer" containerID="8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.217046 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735\": container with ID starting with 8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735 not found: ID does not exist" containerID="8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.217088 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735"} err="failed to get container status \"8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735\": rpc error: code = NotFound desc = could not find container \"8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735\": container with ID starting with 8d8cb9d391e2f27cfb2f3160264a6717cc0b47e8af92af6619364e8827a1f735 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.217123 4835 scope.go:117] "RemoveContainer" containerID="246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.217700 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0\": container with ID starting with 246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0 not found: ID does not exist" containerID="246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.217731 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0"} err="failed to get container status \"246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0\": rpc error: code = NotFound desc = could not find container \"246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0\": container with ID starting with 246b8e0b475ef904d9dbd3fe7343a9f509d771a45fd7bffc45ec25416e9a5be0 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.217748 4835 scope.go:117] "RemoveContainer" containerID="5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.218143 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96\": container with ID starting with 5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96 not found: ID does not exist" containerID="5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.218167 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96"} err="failed to get container status \"5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96\": rpc error: code = NotFound desc = could not find container \"5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96\": container with ID 
starting with 5cd481f54a16cd17c403a65a5f2e8479df416f4f079c1b6af0558d770f839e96 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.218189 4835 scope.go:117] "RemoveContainer" containerID="f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.239059 4835 scope.go:117] "RemoveContainer" containerID="51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.269130 4835 scope.go:117] "RemoveContainer" containerID="c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.313033 4835 scope.go:117] "RemoveContainer" containerID="f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.313520 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61\": container with ID starting with f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61 not found: ID does not exist" containerID="f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.313557 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61"} err="failed to get container status \"f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61\": rpc error: code = NotFound desc = could not find container \"f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61\": container with ID starting with f291b91fbd2ae92dce3d83a9032bb613fa74aee7e81f14c3a241fc39aa27bd61 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.313587 4835 scope.go:117] "RemoveContainer" containerID="51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.313884 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6\": container with ID starting with 51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6 not found: ID does not exist" containerID="51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.313908 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6"} err="failed to get container status \"51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6\": rpc error: code = NotFound desc = could not find container \"51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6\": container with ID starting with 51b21121a48ed310b588e868f8ac180d173015e09b8bbfbc853605b5a5efdcd6 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.313924 4835 scope.go:117] "RemoveContainer" containerID="c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.316558 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c\": container with ID starting with c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c not found: ID does not exist" containerID="c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.316582 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c"} err="failed to get container status \"c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c\": rpc error: code = NotFound desc = could not find container \"c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c\": container with ID starting with c9d428ee76da4d306c08dd7cc0200d6f8002556735fc4a23de10378fb94aa22c not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.316605 4835 scope.go:117] "RemoveContainer" containerID="bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.357288 4835 scope.go:117] "RemoveContainer" containerID="8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.380365 4835 scope.go:117] "RemoveContainer" containerID="994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.398053 4835 scope.go:117] "RemoveContainer" containerID="bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.398661 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde\": container with ID starting with bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde not found: ID does not exist" containerID="bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.398691 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde"} err="failed to get container status \"bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde\": rpc error: code = NotFound desc = could not find container \"bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde\": container with ID starting with bb010cc1107c454d1afac60390b8162def3c99921ecb755a26c82ee8c590fdde not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.398713 4835 scope.go:117] "RemoveContainer" containerID="8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.398930 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773\": container with ID starting with 8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773 not found: ID does not exist" containerID="8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.398950 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773"} err="failed to get container status 
\"8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773\": rpc error: code = NotFound desc = could not find container \"8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773\": container with ID starting with 8ed2db07248a43c5214276c0dee16883c38f34c280c9cb6dfdf896770c7ee773 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.398963 4835 scope.go:117] "RemoveContainer" containerID="994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.399204 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854\": container with ID starting with 994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854 not found: ID does not exist" containerID="994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.399223 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854"} err="failed to get container status \"994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854\": rpc error: code = NotFound desc = could not find container \"994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854\": container with ID starting with 994ac5b988c2b7963a6fb85cf66ca729b5aa65cd2ba33c1a3485fc6c07cbc854 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.399233 4835 scope.go:117] "RemoveContainer" containerID="13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.404095 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdd7"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.418967 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fdd7"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.427853 4835 scope.go:117] "RemoveContainer" containerID="13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7" Oct 02 11:01:39 crc kubenswrapper[4835]: E1002 11:01:39.428193 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7\": container with ID starting with 13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7 not found: ID does not exist" containerID="13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.428272 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7"} err="failed to get container status \"13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7\": rpc error: code = NotFound desc = could not find container \"13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7\": container with ID starting with 13a4b78ade4b86c36fe1ebe859a78f0c154fcb355b848a72ffe4365462beded7 not found: ID does not exist" Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.431974 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-25xg8"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 
11:01:39.434980 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqdkv"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.437621 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nqdkv"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.464079 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qvnb"] Oct 02 11:01:39 crc kubenswrapper[4835]: I1002 11:01:39.473799 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8qvnb"] Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.098577 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" event={"ID":"a38861d2-5ab5-49ec-ac3e-1980fd30757a","Type":"ContainerStarted","Data":"c09ee9919ddd2d024c707d4077e78818342df960de5f18f831e17718e0d7be4d"} Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.099052 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" event={"ID":"a38861d2-5ab5-49ec-ac3e-1980fd30757a","Type":"ContainerStarted","Data":"ddbdda495c772913abffc0fa8fa36f98beb74728ce57d145bed6402d2b7c5a17"} Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.099101 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.103610 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.123849 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-25xg8" podStartSLOduration=2.123823761 podStartE2EDuration="2.123823761s" podCreationTimestamp="2025-10-02 11:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:01:40.119906326 +0000 UTC m=+376.679813907" watchObservedRunningTime="2025-10-02 11:01:40.123823761 +0000 UTC m=+376.683731342" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.264777 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" path="/var/lib/kubelet/pods/038c63fa-b6fc-4725-9f58-e9c3bfe9595d/volumes" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.266703 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" path="/var/lib/kubelet/pods/2557fb3c-0a66-47fb-95b2-64e41d22a740/volumes" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.268539 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" path="/var/lib/kubelet/pods/33d53270-0571-4a70-8b20-3b3f181e1a5c/volumes" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.271405 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" path="/var/lib/kubelet/pods/9ce2263d-d93b-48cb-b9a5-ec10967c9730/volumes" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.273101 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" path="/var/lib/kubelet/pods/c1ab4193-73f3-4a23-a134-ca28f61c7eb0/volumes" 
Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742326 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-52mhn"] Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742608 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742622 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742634 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742640 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742649 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742656 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742664 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742670 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742678 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerName="marketplace-operator" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742684 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerName="marketplace-operator" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742693 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742698 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742705 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742711 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742719 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742725 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742734 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" 
containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742741 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742749 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742755 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="extract-utilities" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742763 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742769 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="extract-content" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742781 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742787 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: E1002 11:01:40.742793 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742799 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742892 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d53270-0571-4a70-8b20-3b3f181e1a5c" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742906 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2557fb3c-0a66-47fb-95b2-64e41d22a740" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742916 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="038c63fa-b6fc-4725-9f58-e9c3bfe9595d" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742924 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ab4193-73f3-4a23-a134-ca28f61c7eb0" containerName="marketplace-operator" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.742934 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce2263d-d93b-48cb-b9a5-ec10967c9730" containerName="registry-server" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.744062 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.746410 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.758843 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52mhn"] Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.894399 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7cb39fe-2774-4e58-966a-78d55838e9f1-catalog-content\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.894467 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7cb39fe-2774-4e58-966a-78d55838e9f1-utilities\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.894509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqpfg\" (UniqueName: \"kubernetes.io/projected/d7cb39fe-2774-4e58-966a-78d55838e9f1-kube-api-access-dqpfg\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.942340 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6dqb8"] Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.943656 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.946727 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.956474 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6dqb8"] Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.995584 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7cb39fe-2774-4e58-966a-78d55838e9f1-catalog-content\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.995660 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7cb39fe-2774-4e58-966a-78d55838e9f1-utilities\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.995697 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqpfg\" (UniqueName: \"kubernetes.io/projected/d7cb39fe-2774-4e58-966a-78d55838e9f1-kube-api-access-dqpfg\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.996269 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7cb39fe-2774-4e58-966a-78d55838e9f1-utilities\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:40 crc kubenswrapper[4835]: I1002 11:01:40.996367 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7cb39fe-2774-4e58-966a-78d55838e9f1-catalog-content\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.019337 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqpfg\" (UniqueName: \"kubernetes.io/projected/d7cb39fe-2774-4e58-966a-78d55838e9f1-kube-api-access-dqpfg\") pod \"redhat-marketplace-52mhn\" (UID: \"d7cb39fe-2774-4e58-966a-78d55838e9f1\") " pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.078933 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.097140 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz47w\" (UniqueName: \"kubernetes.io/projected/e36c5663-20a7-467b-a112-9f6a409bde0a-kube-api-access-bz47w\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.097726 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-utilities\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.097779 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-catalog-content\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.199620 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz47w\" (UniqueName: \"kubernetes.io/projected/e36c5663-20a7-467b-a112-9f6a409bde0a-kube-api-access-bz47w\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.199666 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-utilities\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.199700 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-catalog-content\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.200258 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-catalog-content\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.201081 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-utilities\") pod \"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.223485 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz47w\" (UniqueName: \"kubernetes.io/projected/e36c5663-20a7-467b-a112-9f6a409bde0a-kube-api-access-bz47w\") pod 
\"certified-operators-6dqb8\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.256979 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52mhn"] Oct 02 11:01:41 crc kubenswrapper[4835]: W1002 11:01:41.267297 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7cb39fe_2774_4e58_966a_78d55838e9f1.slice/crio-f64e2516113b2bf37069562848c1562fae03b9c3e90a3c0246fea0e19dc29e48 WatchSource:0}: Error finding container f64e2516113b2bf37069562848c1562fae03b9c3e90a3c0246fea0e19dc29e48: Status 404 returned error can't find the container with id f64e2516113b2bf37069562848c1562fae03b9c3e90a3c0246fea0e19dc29e48 Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.276067 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:41 crc kubenswrapper[4835]: I1002 11:01:41.485188 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6dqb8"] Oct 02 11:01:41 crc kubenswrapper[4835]: W1002 11:01:41.494743 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode36c5663_20a7_467b_a112_9f6a409bde0a.slice/crio-25a29a990b5debb9ad9d057fdcafa915cf55a75abfbdf3f283480916e705f07e WatchSource:0}: Error finding container 25a29a990b5debb9ad9d057fdcafa915cf55a75abfbdf3f283480916e705f07e: Status 404 returned error can't find the container with id 25a29a990b5debb9ad9d057fdcafa915cf55a75abfbdf3f283480916e705f07e Oct 02 11:01:42 crc kubenswrapper[4835]: I1002 11:01:42.124850 4835 generic.go:334] "Generic (PLEG): container finished" podID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerID="8d9761140605e7a8ad46e61d380f11e029e80df5f83ba81570d757d9114fc351" exitCode=0 Oct 02 11:01:42 crc kubenswrapper[4835]: I1002 11:01:42.124956 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dqb8" event={"ID":"e36c5663-20a7-467b-a112-9f6a409bde0a","Type":"ContainerDied","Data":"8d9761140605e7a8ad46e61d380f11e029e80df5f83ba81570d757d9114fc351"} Oct 02 11:01:42 crc kubenswrapper[4835]: I1002 11:01:42.125037 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dqb8" event={"ID":"e36c5663-20a7-467b-a112-9f6a409bde0a","Type":"ContainerStarted","Data":"25a29a990b5debb9ad9d057fdcafa915cf55a75abfbdf3f283480916e705f07e"} Oct 02 11:01:42 crc kubenswrapper[4835]: I1002 11:01:42.128665 4835 generic.go:334] "Generic (PLEG): container finished" podID="d7cb39fe-2774-4e58-966a-78d55838e9f1" containerID="f09e0e3c7a07a223d20ef0a96af5ae159459a1433c0c3e37a7f77cae513fcebc" exitCode=0 Oct 02 11:01:42 crc kubenswrapper[4835]: I1002 11:01:42.129635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52mhn" event={"ID":"d7cb39fe-2774-4e58-966a-78d55838e9f1","Type":"ContainerDied","Data":"f09e0e3c7a07a223d20ef0a96af5ae159459a1433c0c3e37a7f77cae513fcebc"} Oct 02 11:01:42 crc kubenswrapper[4835]: I1002 11:01:42.129832 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52mhn" event={"ID":"d7cb39fe-2774-4e58-966a-78d55838e9f1","Type":"ContainerStarted","Data":"f64e2516113b2bf37069562848c1562fae03b9c3e90a3c0246fea0e19dc29e48"} Oct 
02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.136431 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9stw4"] Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.138994 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.141761 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.148907 4835 generic.go:334] "Generic (PLEG): container finished" podID="d7cb39fe-2774-4e58-966a-78d55838e9f1" containerID="c374f4aa6d913666a5d8d1d2fbf8773c89abd9e9e79a160f5258d380e7b7b3c2" exitCode=0 Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.149007 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52mhn" event={"ID":"d7cb39fe-2774-4e58-966a-78d55838e9f1","Type":"ContainerDied","Data":"c374f4aa6d913666a5d8d1d2fbf8773c89abd9e9e79a160f5258d380e7b7b3c2"} Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.151526 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9stw4"] Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.152306 4835 generic.go:334] "Generic (PLEG): container finished" podID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerID="b9b0d2c1a0da509b044da5a26c60a870e4f3ca31a05c539024fe5e01cd084bf4" exitCode=0 Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.152346 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dqb8" event={"ID":"e36c5663-20a7-467b-a112-9f6a409bde0a","Type":"ContainerDied","Data":"b9b0d2c1a0da509b044da5a26c60a870e4f3ca31a05c539024fe5e01cd084bf4"} Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.229467 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzvnx\" (UniqueName: \"kubernetes.io/projected/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-kube-api-access-tzvnx\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.229536 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-utilities\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.229575 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-catalog-content\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.330466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzvnx\" (UniqueName: \"kubernetes.io/projected/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-kube-api-access-tzvnx\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 
11:01:43.330907 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-utilities\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.332866 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-catalog-content\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.332107 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-utilities\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.333989 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-catalog-content\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.337606 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6blb"] Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.338945 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.341425 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.351564 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6blb"] Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.371180 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzvnx\" (UniqueName: \"kubernetes.io/projected/e5b7c17f-8144-41de-bd62-b8ec03c34fbf-kube-api-access-tzvnx\") pod \"redhat-operators-9stw4\" (UID: \"e5b7c17f-8144-41de-bd62-b8ec03c34fbf\") " pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.435860 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d544c135-38aa-4425-baf2-765c1f899617-utilities\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.436097 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d544c135-38aa-4425-baf2-765c1f899617-catalog-content\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.436230 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrxdp\" (UniqueName: \"kubernetes.io/projected/d544c135-38aa-4425-baf2-765c1f899617-kube-api-access-vrxdp\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.514771 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.538404 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d544c135-38aa-4425-baf2-765c1f899617-catalog-content\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.538621 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrxdp\" (UniqueName: \"kubernetes.io/projected/d544c135-38aa-4425-baf2-765c1f899617-kube-api-access-vrxdp\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.538961 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d544c135-38aa-4425-baf2-765c1f899617-utilities\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.539654 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d544c135-38aa-4425-baf2-765c1f899617-utilities\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.539761 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d544c135-38aa-4425-baf2-765c1f899617-catalog-content\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.562941 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrxdp\" (UniqueName: \"kubernetes.io/projected/d544c135-38aa-4425-baf2-765c1f899617-kube-api-access-vrxdp\") pod \"community-operators-m6blb\" (UID: \"d544c135-38aa-4425-baf2-765c1f899617\") " pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.664865 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.719279 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9stw4"] Oct 02 11:01:43 crc kubenswrapper[4835]: I1002 11:01:43.881111 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6blb"] Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.159968 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52mhn" event={"ID":"d7cb39fe-2774-4e58-966a-78d55838e9f1","Type":"ContainerStarted","Data":"b11bc453cd88faeb72b4232ad1778aa0d342740f32382e281e6826bc642ac684"} Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.162752 4835 generic.go:334] "Generic (PLEG): container finished" podID="e5b7c17f-8144-41de-bd62-b8ec03c34fbf" containerID="65643998cd109678bb8b41b8e950436cfa0bbca3ded86d4a9682d92a877a27a8" exitCode=0 Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.162826 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9stw4" event={"ID":"e5b7c17f-8144-41de-bd62-b8ec03c34fbf","Type":"ContainerDied","Data":"65643998cd109678bb8b41b8e950436cfa0bbca3ded86d4a9682d92a877a27a8"} Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.162860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9stw4" event={"ID":"e5b7c17f-8144-41de-bd62-b8ec03c34fbf","Type":"ContainerStarted","Data":"3a6b4463a96ca16d21fab439b91104c4b4be38308136e43916805ea0f78edee7"} Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.166651 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dqb8" event={"ID":"e36c5663-20a7-467b-a112-9f6a409bde0a","Type":"ContainerStarted","Data":"775a493269a22a1ffdea195fdb8546262ad75863982f43ebe4042eadac647b5a"} Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.169570 4835 generic.go:334] "Generic (PLEG): container finished" podID="d544c135-38aa-4425-baf2-765c1f899617" containerID="152d84968d47a28cf13f2341e526b9200181fdefebaa2b818966dce6f2606071" exitCode=0 Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.169607 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6blb" event={"ID":"d544c135-38aa-4425-baf2-765c1f899617","Type":"ContainerDied","Data":"152d84968d47a28cf13f2341e526b9200181fdefebaa2b818966dce6f2606071"} Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.169624 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6blb" event={"ID":"d544c135-38aa-4425-baf2-765c1f899617","Type":"ContainerStarted","Data":"a425f308e075f6bab175a233376ce4a4914bad2c792c481ae5399f75ff6d1a8e"} Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.182905 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-52mhn" podStartSLOduration=2.404283 podStartE2EDuration="4.18288451s" podCreationTimestamp="2025-10-02 11:01:40 +0000 UTC" firstStartedPulling="2025-10-02 11:01:42.131315802 +0000 UTC m=+378.691223423" lastFinishedPulling="2025-10-02 11:01:43.909917352 +0000 UTC m=+380.469824933" observedRunningTime="2025-10-02 11:01:44.181006024 +0000 UTC m=+380.740913605" watchObservedRunningTime="2025-10-02 11:01:44.18288451 +0000 UTC m=+380.742792101" Oct 02 11:01:44 crc kubenswrapper[4835]: I1002 11:01:44.209466 
4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6dqb8" podStartSLOduration=2.4600824230000002 podStartE2EDuration="4.209443762s" podCreationTimestamp="2025-10-02 11:01:40 +0000 UTC" firstStartedPulling="2025-10-02 11:01:42.131007183 +0000 UTC m=+378.690914764" lastFinishedPulling="2025-10-02 11:01:43.880368512 +0000 UTC m=+380.440276103" observedRunningTime="2025-10-02 11:01:44.205559177 +0000 UTC m=+380.765466768" watchObservedRunningTime="2025-10-02 11:01:44.209443762 +0000 UTC m=+380.769351343" Oct 02 11:01:45 crc kubenswrapper[4835]: I1002 11:01:45.178178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9stw4" event={"ID":"e5b7c17f-8144-41de-bd62-b8ec03c34fbf","Type":"ContainerStarted","Data":"fda3deb3526481b4a1fbb7bcf68983639f4cfb038c5cf9c0321a1f313d44766a"} Oct 02 11:01:45 crc kubenswrapper[4835]: I1002 11:01:45.179788 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6blb" event={"ID":"d544c135-38aa-4425-baf2-765c1f899617","Type":"ContainerStarted","Data":"7d8bd74d49effb00db50bf12fc920cdf72d8a6b60c5d84cd3c841b7dc13f2346"} Oct 02 11:01:46 crc kubenswrapper[4835]: I1002 11:01:46.194683 4835 generic.go:334] "Generic (PLEG): container finished" podID="e5b7c17f-8144-41de-bd62-b8ec03c34fbf" containerID="fda3deb3526481b4a1fbb7bcf68983639f4cfb038c5cf9c0321a1f313d44766a" exitCode=0 Oct 02 11:01:46 crc kubenswrapper[4835]: I1002 11:01:46.194747 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9stw4" event={"ID":"e5b7c17f-8144-41de-bd62-b8ec03c34fbf","Type":"ContainerDied","Data":"fda3deb3526481b4a1fbb7bcf68983639f4cfb038c5cf9c0321a1f313d44766a"} Oct 02 11:01:46 crc kubenswrapper[4835]: I1002 11:01:46.198025 4835 generic.go:334] "Generic (PLEG): container finished" podID="d544c135-38aa-4425-baf2-765c1f899617" containerID="7d8bd74d49effb00db50bf12fc920cdf72d8a6b60c5d84cd3c841b7dc13f2346" exitCode=0 Oct 02 11:01:46 crc kubenswrapper[4835]: I1002 11:01:46.198083 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6blb" event={"ID":"d544c135-38aa-4425-baf2-765c1f899617","Type":"ContainerDied","Data":"7d8bd74d49effb00db50bf12fc920cdf72d8a6b60c5d84cd3c841b7dc13f2346"} Oct 02 11:01:48 crc kubenswrapper[4835]: I1002 11:01:48.219915 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9stw4" event={"ID":"e5b7c17f-8144-41de-bd62-b8ec03c34fbf","Type":"ContainerStarted","Data":"e18d40c746ad74b876ebc2b9fb9877f3f3cbda2b16028a1a49fecb3a4f052686"} Oct 02 11:01:48 crc kubenswrapper[4835]: I1002 11:01:48.223315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6blb" event={"ID":"d544c135-38aa-4425-baf2-765c1f899617","Type":"ContainerStarted","Data":"291823f2693c2e15c3806cff44c393739905b98692338457c0a1fbd0bbe7f43f"} Oct 02 11:01:48 crc kubenswrapper[4835]: I1002 11:01:48.242744 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9stw4" podStartSLOduration=2.4506981310000002 podStartE2EDuration="5.242711371s" podCreationTimestamp="2025-10-02 11:01:43 +0000 UTC" firstStartedPulling="2025-10-02 11:01:44.164034815 +0000 UTC m=+380.723942396" lastFinishedPulling="2025-10-02 11:01:46.956048015 +0000 UTC m=+383.515955636" observedRunningTime="2025-10-02 11:01:48.241129124 +0000 UTC 
m=+384.801036705" watchObservedRunningTime="2025-10-02 11:01:48.242711371 +0000 UTC m=+384.802618952" Oct 02 11:01:48 crc kubenswrapper[4835]: I1002 11:01:48.267594 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6blb" podStartSLOduration=2.845610819 podStartE2EDuration="5.26747476s" podCreationTimestamp="2025-10-02 11:01:43 +0000 UTC" firstStartedPulling="2025-10-02 11:01:44.170950928 +0000 UTC m=+380.730858509" lastFinishedPulling="2025-10-02 11:01:46.592814869 +0000 UTC m=+383.152722450" observedRunningTime="2025-10-02 11:01:48.265732438 +0000 UTC m=+384.825640019" watchObservedRunningTime="2025-10-02 11:01:48.26747476 +0000 UTC m=+384.827382341" Oct 02 11:01:51 crc kubenswrapper[4835]: I1002 11:01:51.079853 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:51 crc kubenswrapper[4835]: I1002 11:01:51.080408 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:51 crc kubenswrapper[4835]: I1002 11:01:51.144470 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:51 crc kubenswrapper[4835]: I1002 11:01:51.280846 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:51 crc kubenswrapper[4835]: I1002 11:01:51.280949 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:51 crc kubenswrapper[4835]: I1002 11:01:51.301034 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-52mhn" Oct 02 11:01:51 crc kubenswrapper[4835]: I1002 11:01:51.330580 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:52 crc kubenswrapper[4835]: I1002 11:01:52.294124 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 11:01:53 crc kubenswrapper[4835]: I1002 11:01:53.515717 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:53 crc kubenswrapper[4835]: I1002 11:01:53.516282 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:53 crc kubenswrapper[4835]: I1002 11:01:53.575522 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:53 crc kubenswrapper[4835]: I1002 11:01:53.664988 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:53 crc kubenswrapper[4835]: I1002 11:01:53.665140 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:53 crc kubenswrapper[4835]: I1002 11:01:53.727644 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:01:54 crc kubenswrapper[4835]: I1002 11:01:54.313938 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-9stw4" Oct 02 11:01:54 crc kubenswrapper[4835]: I1002 11:01:54.314038 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6blb" Oct 02 11:02:11 crc kubenswrapper[4835]: I1002 11:02:11.984619 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:02:11 crc kubenswrapper[4835]: I1002 11:02:11.985555 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.550057 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j2rmj"] Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.550787 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.574694 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j2rmj"] Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-bound-sa-token\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719571 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-registry-tls\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719639 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-registry-certificates\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719671 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719722 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-trusted-ca\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719745 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q55z\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-kube-api-access-5q55z\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.719777 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.749079 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.821714 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.821783 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-trusted-ca\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.821800 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q55z\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-kube-api-access-5q55z\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.821908 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-bound-sa-token\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: 
\"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.821939 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-registry-tls\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.821968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-registry-certificates\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.821996 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.822612 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.829665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.908777 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-trusted-ca\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.908911 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-registry-certificates\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.911249 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-registry-tls\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.911645 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5q55z\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-kube-api-access-5q55z\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.912153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06-bound-sa-token\") pod \"image-registry-66df7c8f76-j2rmj\" (UID: \"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:13 crc kubenswrapper[4835]: I1002 11:02:13.940628 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:14 crc kubenswrapper[4835]: I1002 11:02:14.363589 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j2rmj"] Oct 02 11:02:14 crc kubenswrapper[4835]: I1002 11:02:14.404644 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" event={"ID":"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06","Type":"ContainerStarted","Data":"44fc006a48d63006e779fb11449292cce4eb8c5b3d03a8016ebac4ee714679bd"} Oct 02 11:02:15 crc kubenswrapper[4835]: I1002 11:02:15.413064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" event={"ID":"abd0ad64-bd7e-4fc6-a1eb-a6e6ef2cac06","Type":"ContainerStarted","Data":"c80ec7a42b488b7e1ce4c5a275a5214304805f254995aafe3a3224b4d61be0e4"} Oct 02 11:02:15 crc kubenswrapper[4835]: I1002 11:02:15.413520 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:15 crc kubenswrapper[4835]: I1002 11:02:15.451387 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" podStartSLOduration=2.451348443 podStartE2EDuration="2.451348443s" podCreationTimestamp="2025-10-02 11:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:02:15.445156041 +0000 UTC m=+412.005063692" watchObservedRunningTime="2025-10-02 11:02:15.451348443 +0000 UTC m=+412.011256064" Oct 02 11:02:33 crc kubenswrapper[4835]: I1002 11:02:33.946017 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-j2rmj" Oct 02 11:02:33 crc kubenswrapper[4835]: I1002 11:02:33.997882 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f29zd"] Oct 02 11:02:41 crc kubenswrapper[4835]: I1002 11:02:41.985033 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:02:41 crc kubenswrapper[4835]: I1002 11:02:41.986554 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.036705 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" podUID="b9aa78d4-be0b-4eb3-9cde-117c72496d16" containerName="registry" containerID="cri-o://0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be" gracePeriod=30 Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.449186 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.555865 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9aa78d4-be0b-4eb3-9cde-117c72496d16-installation-pull-secrets\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.556121 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.556194 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-tls\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.556274 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gd9l\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-kube-api-access-5gd9l\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.556299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-trusted-ca\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.556327 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-certificates\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.556380 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-bound-sa-token\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.556412 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9aa78d4-be0b-4eb3-9cde-117c72496d16-ca-trust-extracted\") pod \"b9aa78d4-be0b-4eb3-9cde-117c72496d16\" (UID: 
\"b9aa78d4-be0b-4eb3-9cde-117c72496d16\") " Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.558906 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.559062 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.564461 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.565240 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9aa78d4-be0b-4eb3-9cde-117c72496d16-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.565537 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.565729 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-kube-api-access-5gd9l" (OuterVolumeSpecName: "kube-api-access-5gd9l") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "kube-api-access-5gd9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.571964 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.581012 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9aa78d4-be0b-4eb3-9cde-117c72496d16-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b9aa78d4-be0b-4eb3-9cde-117c72496d16" (UID: "b9aa78d4-be0b-4eb3-9cde-117c72496d16"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.658142 4835 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9aa78d4-be0b-4eb3-9cde-117c72496d16-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.658202 4835 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.658239 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gd9l\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-kube-api-access-5gd9l\") on node \"crc\" DevicePath \"\"" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.658248 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.658258 4835 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9aa78d4-be0b-4eb3-9cde-117c72496d16-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.658266 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9aa78d4-be0b-4eb3-9cde-117c72496d16-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.658281 4835 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9aa78d4-be0b-4eb3-9cde-117c72496d16-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.709960 4835 generic.go:334] "Generic (PLEG): container finished" podID="b9aa78d4-be0b-4eb3-9cde-117c72496d16" containerID="0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be" exitCode=0 Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.710545 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.711155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" event={"ID":"b9aa78d4-be0b-4eb3-9cde-117c72496d16","Type":"ContainerDied","Data":"0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be"} Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.711530 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f29zd" event={"ID":"b9aa78d4-be0b-4eb3-9cde-117c72496d16","Type":"ContainerDied","Data":"e41f7659d341b1c5737dde89e41b8636204c5e7bcc6fb1240ec70fddd68bdca9"} Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.711583 4835 scope.go:117] "RemoveContainer" containerID="0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.743347 4835 scope.go:117] "RemoveContainer" containerID="0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be" Oct 02 11:02:59 crc kubenswrapper[4835]: E1002 11:02:59.744738 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be\": container with ID starting with 0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be not found: ID does not exist" containerID="0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.744796 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be"} err="failed to get container status \"0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be\": rpc error: code = NotFound desc = could not find container \"0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be\": container with ID starting with 0f8c67f7163dc024e5abdfa99fd7a85739ca90b33a0d28be2f110ecd17a368be not found: ID does not exist" Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.759756 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f29zd"] Oct 02 11:02:59 crc kubenswrapper[4835]: I1002 11:02:59.769667 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f29zd"] Oct 02 11:03:00 crc kubenswrapper[4835]: I1002 11:03:00.262727 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9aa78d4-be0b-4eb3-9cde-117c72496d16" path="/var/lib/kubelet/pods/b9aa78d4-be0b-4eb3-9cde-117c72496d16/volumes" Oct 02 11:03:11 crc kubenswrapper[4835]: I1002 11:03:11.984690 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:03:11 crc kubenswrapper[4835]: I1002 11:03:11.985600 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:03:11 crc 
kubenswrapper[4835]: I1002 11:03:11.985712 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:03:11 crc kubenswrapper[4835]: I1002 11:03:11.986635 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c5852139e1391baa79b13c185dfc951b255ae7419577505db83d6247b44ad1d"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:03:11 crc kubenswrapper[4835]: I1002 11:03:11.986767 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://9c5852139e1391baa79b13c185dfc951b255ae7419577505db83d6247b44ad1d" gracePeriod=600 Oct 02 11:03:13 crc kubenswrapper[4835]: I1002 11:03:12.806857 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="9c5852139e1391baa79b13c185dfc951b255ae7419577505db83d6247b44ad1d" exitCode=0 Oct 02 11:03:13 crc kubenswrapper[4835]: I1002 11:03:12.807024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"9c5852139e1391baa79b13c185dfc951b255ae7419577505db83d6247b44ad1d"} Oct 02 11:03:13 crc kubenswrapper[4835]: I1002 11:03:12.807416 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"d053440f4d28c5e876a916d9ffabfb83edb8e183c19537340d638e68ba6f5bab"} Oct 02 11:03:13 crc kubenswrapper[4835]: I1002 11:03:12.807438 4835 scope.go:117] "RemoveContainer" containerID="7bea27615a843f9ca5b9a17bf4f5a5b705afe373d46f979673cc4246f427d00f" Oct 02 11:05:12 crc kubenswrapper[4835]: I1002 11:05:12.006730 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:05:12 crc kubenswrapper[4835]: I1002 11:05:12.007522 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:05:41 crc kubenswrapper[4835]: I1002 11:05:41.984653 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:05:41 crc kubenswrapper[4835]: I1002 11:05:41.985639 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:06:11 crc kubenswrapper[4835]: I1002 11:06:11.984665 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:06:11 crc kubenswrapper[4835]: I1002 11:06:11.985774 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:06:11 crc kubenswrapper[4835]: I1002 11:06:11.985859 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:06:11 crc kubenswrapper[4835]: I1002 11:06:11.986839 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d053440f4d28c5e876a916d9ffabfb83edb8e183c19537340d638e68ba6f5bab"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:06:11 crc kubenswrapper[4835]: I1002 11:06:11.987055 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://d053440f4d28c5e876a916d9ffabfb83edb8e183c19537340d638e68ba6f5bab" gracePeriod=600 Oct 02 11:06:13 crc kubenswrapper[4835]: I1002 11:06:13.032581 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="d053440f4d28c5e876a916d9ffabfb83edb8e183c19537340d638e68ba6f5bab" exitCode=0 Oct 02 11:06:13 crc kubenswrapper[4835]: I1002 11:06:13.032709 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"d053440f4d28c5e876a916d9ffabfb83edb8e183c19537340d638e68ba6f5bab"} Oct 02 11:06:13 crc kubenswrapper[4835]: I1002 11:06:13.033271 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"9482e972b371878a44442b6006c3a59a025e351c1ec2a25e635bbcea7c81f32b"} Oct 02 11:06:13 crc kubenswrapper[4835]: I1002 11:06:13.033316 4835 scope.go:117] "RemoveContainer" containerID="9c5852139e1391baa79b13c185dfc951b255ae7419577505db83d6247b44ad1d" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.170574 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zd9n5"] Oct 02 11:07:10 crc kubenswrapper[4835]: E1002 11:07:10.171398 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9aa78d4-be0b-4eb3-9cde-117c72496d16" containerName="registry" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.171417 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9aa78d4-be0b-4eb3-9cde-117c72496d16" containerName="registry" Oct 02 11:07:10 
crc kubenswrapper[4835]: I1002 11:07:10.171528 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9aa78d4-be0b-4eb3-9cde-117c72496d16" containerName="registry" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.172017 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.179879 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-k2pjv"] Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.180176 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.180189 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jjftp" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.180318 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.180622 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-k2pjv" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.184659 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zd9n5"] Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.184714 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wzxbh" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.198667 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-k2pjv"] Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.201726 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lppfz"] Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.202513 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.207821 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cpdsq" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.214390 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lppfz"] Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.253765 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j84d\" (UniqueName: \"kubernetes.io/projected/818c5e73-52cb-47f0-84b4-1931dd17f6e8-kube-api-access-7j84d\") pod \"cert-manager-cainjector-7f985d654d-zd9n5\" (UID: \"818c5e73-52cb-47f0-84b4-1931dd17f6e8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.355286 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdggq\" (UniqueName: \"kubernetes.io/projected/d962c289-233d-4788-a761-50ff86c59da8-kube-api-access-mdggq\") pod \"cert-manager-5b446d88c5-k2pjv\" (UID: \"d962c289-233d-4788-a761-50ff86c59da8\") " pod="cert-manager/cert-manager-5b446d88c5-k2pjv" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.355433 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j84d\" (UniqueName: \"kubernetes.io/projected/818c5e73-52cb-47f0-84b4-1931dd17f6e8-kube-api-access-7j84d\") pod \"cert-manager-cainjector-7f985d654d-zd9n5\" (UID: \"818c5e73-52cb-47f0-84b4-1931dd17f6e8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.355460 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnjhx\" (UniqueName: \"kubernetes.io/projected/dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b-kube-api-access-tnjhx\") pod \"cert-manager-webhook-5655c58dd6-lppfz\" (UID: \"dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.373373 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j84d\" (UniqueName: \"kubernetes.io/projected/818c5e73-52cb-47f0-84b4-1931dd17f6e8-kube-api-access-7j84d\") pod \"cert-manager-cainjector-7f985d654d-zd9n5\" (UID: \"818c5e73-52cb-47f0-84b4-1931dd17f6e8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.456313 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnjhx\" (UniqueName: \"kubernetes.io/projected/dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b-kube-api-access-tnjhx\") pod \"cert-manager-webhook-5655c58dd6-lppfz\" (UID: \"dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.456717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdggq\" (UniqueName: \"kubernetes.io/projected/d962c289-233d-4788-a761-50ff86c59da8-kube-api-access-mdggq\") pod \"cert-manager-5b446d88c5-k2pjv\" (UID: \"d962c289-233d-4788-a761-50ff86c59da8\") " pod="cert-manager/cert-manager-5b446d88c5-k2pjv" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.475146 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tnjhx\" (UniqueName: \"kubernetes.io/projected/dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b-kube-api-access-tnjhx\") pod \"cert-manager-webhook-5655c58dd6-lppfz\" (UID: \"dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.478340 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdggq\" (UniqueName: \"kubernetes.io/projected/d962c289-233d-4788-a761-50ff86c59da8-kube-api-access-mdggq\") pod \"cert-manager-5b446d88c5-k2pjv\" (UID: \"d962c289-233d-4788-a761-50ff86c59da8\") " pod="cert-manager/cert-manager-5b446d88c5-k2pjv" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.495474 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.500935 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-k2pjv" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.525412 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.710714 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-k2pjv"] Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.730030 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.772200 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zd9n5"] Oct 02 11:07:10 crc kubenswrapper[4835]: W1002 11:07:10.784579 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod818c5e73_52cb_47f0_84b4_1931dd17f6e8.slice/crio-ffd9e3bb09c4c2b363650551545a2aec3c5dd3130a96fc18d65bc7c862d6d26c WatchSource:0}: Error finding container ffd9e3bb09c4c2b363650551545a2aec3c5dd3130a96fc18d65bc7c862d6d26c: Status 404 returned error can't find the container with id ffd9e3bb09c4c2b363650551545a2aec3c5dd3130a96fc18d65bc7c862d6d26c Oct 02 11:07:10 crc kubenswrapper[4835]: I1002 11:07:10.805915 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lppfz"] Oct 02 11:07:10 crc kubenswrapper[4835]: W1002 11:07:10.808144 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc8fd6c7_4ae9_49bf_b9ae_6e50882f670b.slice/crio-4c5c26c39a96a5b621d03bdc4ed789791ad1fd95eefa1c710a8fee8cf7f16b2b WatchSource:0}: Error finding container 4c5c26c39a96a5b621d03bdc4ed789791ad1fd95eefa1c710a8fee8cf7f16b2b: Status 404 returned error can't find the container with id 4c5c26c39a96a5b621d03bdc4ed789791ad1fd95eefa1c710a8fee8cf7f16b2b Oct 02 11:07:11 crc kubenswrapper[4835]: I1002 11:07:11.389212 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-k2pjv" event={"ID":"d962c289-233d-4788-a761-50ff86c59da8","Type":"ContainerStarted","Data":"c5e75dc153b812338cc5e96470259b99de4f188404c2e1f216e1c65a9cd3db85"} Oct 02 11:07:11 crc kubenswrapper[4835]: I1002 11:07:11.392073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" event={"ID":"dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b","Type":"ContainerStarted","Data":"4c5c26c39a96a5b621d03bdc4ed789791ad1fd95eefa1c710a8fee8cf7f16b2b"} Oct 02 11:07:11 crc kubenswrapper[4835]: I1002 11:07:11.395138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" event={"ID":"818c5e73-52cb-47f0-84b4-1931dd17f6e8","Type":"ContainerStarted","Data":"ffd9e3bb09c4c2b363650551545a2aec3c5dd3130a96fc18d65bc7c862d6d26c"} Oct 02 11:07:14 crc kubenswrapper[4835]: I1002 11:07:14.417426 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" event={"ID":"dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b","Type":"ContainerStarted","Data":"da94e0699cf5aaa0f4f294221a9819d5ae515774f00ad5bc10d38d6c3e41543f"} Oct 02 11:07:14 crc kubenswrapper[4835]: I1002 11:07:14.417981 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" Oct 02 11:07:14 crc kubenswrapper[4835]: I1002 11:07:14.420891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" event={"ID":"818c5e73-52cb-47f0-84b4-1931dd17f6e8","Type":"ContainerStarted","Data":"bcf27a4b4f6c6dd1b2760ba1efdf924d66ec6f14624e37af8a4374abfef577ac"} Oct 02 11:07:14 crc kubenswrapper[4835]: I1002 11:07:14.432879 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" podStartSLOduration=1.7929842520000001 podStartE2EDuration="4.432862026s" podCreationTimestamp="2025-10-02 11:07:10 +0000 UTC" firstStartedPulling="2025-10-02 11:07:10.810141862 +0000 UTC m=+707.370049433" lastFinishedPulling="2025-10-02 11:07:13.450019626 +0000 UTC m=+710.009927207" observedRunningTime="2025-10-02 11:07:14.431571819 +0000 UTC m=+710.991479410" watchObservedRunningTime="2025-10-02 11:07:14.432862026 +0000 UTC m=+710.992769607" Oct 02 11:07:14 crc kubenswrapper[4835]: I1002 11:07:14.447136 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-zd9n5" podStartSLOduration=1.789139081 podStartE2EDuration="4.447117501s" podCreationTimestamp="2025-10-02 11:07:10 +0000 UTC" firstStartedPulling="2025-10-02 11:07:10.787878045 +0000 UTC m=+707.347785626" lastFinishedPulling="2025-10-02 11:07:13.445856465 +0000 UTC m=+710.005764046" observedRunningTime="2025-10-02 11:07:14.44640597 +0000 UTC m=+711.006313561" watchObservedRunningTime="2025-10-02 11:07:14.447117501 +0000 UTC m=+711.007025082" Oct 02 11:07:15 crc kubenswrapper[4835]: I1002 11:07:15.427068 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-k2pjv" event={"ID":"d962c289-233d-4788-a761-50ff86c59da8","Type":"ContainerStarted","Data":"ef82ef354662a601dd063203fbedd00442f2dbb0f613ed9b400af04177d8bb8e"} Oct 02 11:07:15 crc kubenswrapper[4835]: I1002 11:07:15.446349 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-k2pjv" podStartSLOduration=1.319417242 podStartE2EDuration="5.446325727s" podCreationTimestamp="2025-10-02 11:07:10 +0000 UTC" firstStartedPulling="2025-10-02 11:07:10.729557879 +0000 UTC m=+707.289465460" lastFinishedPulling="2025-10-02 11:07:14.856466364 +0000 UTC m=+711.416373945" observedRunningTime="2025-10-02 11:07:15.44265029 +0000 UTC m=+712.002557911" 
watchObservedRunningTime="2025-10-02 11:07:15.446325727 +0000 UTC m=+712.006233348" Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.535986 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-lppfz" Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.731384 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79zgl"] Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.732009 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-controller" containerID="cri-o://87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28" gracePeriod=30 Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.732118 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-node" containerID="cri-o://967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a" gracePeriod=30 Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.732117 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="northd" containerID="cri-o://c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353" gracePeriod=30 Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.732228 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="sbdb" containerID="cri-o://c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e" gracePeriod=30 Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.732245 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-acl-logging" containerID="cri-o://28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9" gracePeriod=30 Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.732321 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f" gracePeriod=30 Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.732329 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="nbdb" containerID="cri-o://473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21" gracePeriod=30 Oct 02 11:07:20 crc kubenswrapper[4835]: I1002 11:07:20.803605 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" containerID="cri-o://d9744ccc8b6a5a64b87243892ec48b92624b70e6137acbb3b4a1839480650bcb" gracePeriod=30 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.466213 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/2.log" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.466969 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/1.log" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.467008 4835 generic.go:334] "Generic (PLEG): container finished" podID="cea2edfd-8b9c-44be-be9a-d2feb410da71" containerID="0a6af3c25fd2b9444b9cb65bb4553e343f1f3d4362cac711a5c1a2252386b09b" exitCode=2 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.467059 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerDied","Data":"0a6af3c25fd2b9444b9cb65bb4553e343f1f3d4362cac711a5c1a2252386b09b"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.467098 4835 scope.go:117] "RemoveContainer" containerID="aa102fa35dfd7e69a238d8c7aa3d536c993b5070b7528089eb58722224c0a561" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.467636 4835 scope.go:117] "RemoveContainer" containerID="0a6af3c25fd2b9444b9cb65bb4553e343f1f3d4362cac711a5c1a2252386b09b" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.468020 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-2tw4v_openshift-multus(cea2edfd-8b9c-44be-be9a-d2feb410da71)\"" pod="openshift-multus/multus-2tw4v" podUID="cea2edfd-8b9c-44be-be9a-d2feb410da71" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.471267 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovnkube-controller/3.log" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.473655 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovn-acl-logging/0.log" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.474452 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovn-controller/0.log" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.474944 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="d9744ccc8b6a5a64b87243892ec48b92624b70e6137acbb3b4a1839480650bcb" exitCode=0 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.474983 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e" exitCode=0 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.474994 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21" exitCode=0 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475004 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353" exitCode=0 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475014 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" 
containerID="f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f" exitCode=0 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475022 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a" exitCode=0 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475030 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9" exitCode=143 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475038 4835 generic.go:334] "Generic (PLEG): container finished" podID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerID="87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28" exitCode=143 Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"d9744ccc8b6a5a64b87243892ec48b92624b70e6137acbb3b4a1839480650bcb"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475098 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475113 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475137 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475147 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475158 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.475169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28"} Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.512927 4835 scope.go:117] "RemoveContainer" 
containerID="0804efb2cbae658949dbad61f3c9c30404f2ba0926e89053e90c07955250f6b3" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.763627 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovn-acl-logging/0.log" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.764186 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovn-controller/0.log" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.764791 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823144 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w5ns2"] Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823516 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823536 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823550 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="northd" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823556 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="northd" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823568 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823573 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823587 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823593 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823600 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823609 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823619 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="nbdb" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823625 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="nbdb" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823638 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kubecfg-setup" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823644 4835 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kubecfg-setup" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823653 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-node" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823659 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-node" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823667 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-acl-logging" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823673 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-acl-logging" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823683 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="sbdb" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823689 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="sbdb" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823698 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823704 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823798 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823808 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823819 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823825 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823834 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovn-acl-logging" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823844 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823851 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823859 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="northd" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823868 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="kube-rbac-proxy-node" Oct 
02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823875 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="nbdb" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823883 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="sbdb" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.823989 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.823996 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: E1002 11:07:21.824004 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.824011 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.824137 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" containerName="ovnkube-controller" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.826167 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919090 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-netd\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919164 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-log-socket\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919199 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-ovn-kubernetes\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919242 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-ovn\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-kubelet\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919292 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-env-overrides\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919311 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-etc-openvswitch\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919340 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-var-lib-openvswitch\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919363 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-bin\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919387 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-node-log\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919407 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-script-lib\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919425 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-systemd-units\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919462 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919483 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-slash\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919503 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovn-node-metrics-cert\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919523 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-fsbdj\" (UniqueName: \"kubernetes.io/projected/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-kube-api-access-fsbdj\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919538 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-openvswitch\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919555 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-netns\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919579 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-config\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919567 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919624 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-log-socket" (OuterVolumeSpecName: "log-socket") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919604 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919784 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-slash" (OuterVolumeSpecName: "host-slash") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919783 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919810 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919652 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919696 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919726 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919837 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919839 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919818 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919858 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-node-log" (OuterVolumeSpecName: "node-log") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). 
InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919606 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-systemd\") pod \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\" (UID: \"e1c2dc14-32fa-43fc-ae87-11d02eb3400a\") " Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.919919 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920021 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920162 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920338 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920649 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-cni-netd\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920716 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-env-overrides\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-cni-bin\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920776 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920798 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-var-lib-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920816 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-systemd\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920855 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-699wh\" (UniqueName: \"kubernetes.io/projected/d07255be-42e1-462f-a2a9-df47fb9de900-kube-api-access-699wh\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920877 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-log-socket\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920932 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-ovnkube-config\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.920992 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921015 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-node-log\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921211 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-run-netns\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921330 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-etc-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921398 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-slash\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921438 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-ovnkube-script-lib\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921477 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-kubelet\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921637 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d07255be-42e1-462f-a2a9-df47fb9de900-ovn-node-metrics-cert\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-systemd-units\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921713 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-ovn\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921817 4835 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921833 4835 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921843 4835 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921851 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921860 4835 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921871 4835 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921881 4835 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921891 4835 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921900 4835 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921908 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921916 4835 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921924 4835 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921932 4835 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921940 4835 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921948 4835 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921956 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.921963 4835 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.925335 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-kube-api-access-fsbdj" (OuterVolumeSpecName: "kube-api-access-fsbdj") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "kube-api-access-fsbdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.925655 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:07:21 crc kubenswrapper[4835]: I1002 11:07:21.939340 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e1c2dc14-32fa-43fc-ae87-11d02eb3400a" (UID: "e1c2dc14-32fa-43fc-ae87-11d02eb3400a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023570 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-699wh\" (UniqueName: \"kubernetes.io/projected/d07255be-42e1-462f-a2a9-df47fb9de900-kube-api-access-699wh\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023633 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-log-socket\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023655 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-ovnkube-config\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023683 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-node-log\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023735 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-run-netns\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023762 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-etc-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023787 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-slash\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023803 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-ovnkube-script-lib\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023822 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023841 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-kubelet\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023831 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-node-log\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023863 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d07255be-42e1-462f-a2a9-df47fb9de900-ovn-node-metrics-cert\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023881 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-systemd-units\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023900 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-ovn\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023925 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-cni-netd\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024001 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc 
kubenswrapper[4835]: I1002 11:07:22.024039 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-kubelet\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.023950 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-env-overrides\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024474 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-cni-bin\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024552 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-var-lib-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024578 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-systemd\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024670 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-etc-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024709 4835 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024736 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024753 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsbdj\" (UniqueName: \"kubernetes.io/projected/e1c2dc14-32fa-43fc-ae87-11d02eb3400a-kube-api-access-fsbdj\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024775 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-run-netns\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024791 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-systemd\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024815 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-cni-bin\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024815 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-env-overrides\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024849 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-log-socket\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024859 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-slash\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-var-lib-openvswitch\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024897 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-ovnkube-script-lib\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024889 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d07255be-42e1-462f-a2a9-df47fb9de900-ovnkube-config\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024911 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-run-ovn\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-host-cni-netd\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.024914 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d07255be-42e1-462f-a2a9-df47fb9de900-systemd-units\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.028347 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d07255be-42e1-462f-a2a9-df47fb9de900-ovn-node-metrics-cert\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.041702 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-699wh\" (UniqueName: \"kubernetes.io/projected/d07255be-42e1-462f-a2a9-df47fb9de900-kube-api-access-699wh\") pod \"ovnkube-node-w5ns2\" (UID: \"d07255be-42e1-462f-a2a9-df47fb9de900\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.140786 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:22 crc kubenswrapper[4835]: W1002 11:07:22.159624 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07255be_42e1_462f_a2a9_df47fb9de900.slice/crio-34b93e3775f2496d69fdf4b169ef420d4abc34453b6d3efa5bd60023df48a07f WatchSource:0}: Error finding container 34b93e3775f2496d69fdf4b169ef420d4abc34453b6d3efa5bd60023df48a07f: Status 404 returned error can't find the container with id 34b93e3775f2496d69fdf4b169ef420d4abc34453b6d3efa5bd60023df48a07f Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.485780 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovn-acl-logging/0.log" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.486567 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79zgl_e1c2dc14-32fa-43fc-ae87-11d02eb3400a/ovn-controller/0.log" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.487207 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.487238 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79zgl" event={"ID":"e1c2dc14-32fa-43fc-ae87-11d02eb3400a","Type":"ContainerDied","Data":"cd8fd04bf7f63162a8b7df6748366eb034afcb8e5577d5632c8151c20137d1d4"} Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.487447 4835 scope.go:117] "RemoveContainer" containerID="d9744ccc8b6a5a64b87243892ec48b92624b70e6137acbb3b4a1839480650bcb" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.489367 4835 generic.go:334] "Generic (PLEG): container finished" podID="d07255be-42e1-462f-a2a9-df47fb9de900" containerID="d96a65b88bfa23be738c0c515c9aaaed89d24547a8a0824f9abc1dbf41caed6c" exitCode=0 Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.489443 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerDied","Data":"d96a65b88bfa23be738c0c515c9aaaed89d24547a8a0824f9abc1dbf41caed6c"} Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.489478 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"34b93e3775f2496d69fdf4b169ef420d4abc34453b6d3efa5bd60023df48a07f"} Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.490980 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/2.log" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.510312 4835 scope.go:117] "RemoveContainer" containerID="c50585a895ace03d1ad112b6997b521e139a854d41cd111d4185d6e89dde787e" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.531651 4835 scope.go:117] "RemoveContainer" containerID="473f3f9667bb26afa2902bf040bd385b102ac43939ae999945c1b49eeabf4f21" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.544771 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79zgl"] Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.546392 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79zgl"] Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.553110 4835 scope.go:117] "RemoveContainer" containerID="c04120e6d34c2c20d74f5744a890dbcefafdc4fd32f2af2a371f14e399292353" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.577210 4835 scope.go:117] "RemoveContainer" containerID="f03e630b4c4504da7ad21bca4d60e612749f0d92b1c23e12e35fbb745f3a054f" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.589637 4835 scope.go:117] "RemoveContainer" containerID="967b7cb7449571d17fcf7d7de0f2c6ded5864a81f8befdc8e4e6d5276f466c3a" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.603034 4835 scope.go:117] "RemoveContainer" containerID="28cb44bd52b1f93d5fd114e55fa91e9323fb279452f4f2330002451fb00e0eb9" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.616012 4835 scope.go:117] "RemoveContainer" containerID="87acde32da040ee027439bff6be1451dbd5ecc191c7c78c5c0d798b214e0ab28" Oct 02 11:07:22 crc kubenswrapper[4835]: I1002 11:07:22.634579 4835 scope.go:117] "RemoveContainer" containerID="0148efab06ffcf27a49f564dffc8268b6737b8d5a07c9d9d33609baf5c76d803" Oct 02 11:07:23 crc kubenswrapper[4835]: I1002 11:07:23.502984 4835 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"cb81d0c71091511300adf8c1e9c616dad32fbc2d74a01b598fa2a466d92f2a02"} Oct 02 11:07:23 crc kubenswrapper[4835]: I1002 11:07:23.503314 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"bb936226463038d1cd3f242401f8e29a36c1ae3b65cc95321d059e7f042d35e7"} Oct 02 11:07:23 crc kubenswrapper[4835]: I1002 11:07:23.503325 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"aada602904f625a91c5f6adcb010a1a2732c02de5e077ed00de756c9f36f9865"} Oct 02 11:07:23 crc kubenswrapper[4835]: I1002 11:07:23.503334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"d2c110d16c62d4942a08aac11eaee164eaf11bcf2e1660a386e42f901a487d0d"} Oct 02 11:07:23 crc kubenswrapper[4835]: I1002 11:07:23.503345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"cac0c08f3949d1ed9a1af3b8d66427634a04a10f2743b499e7f0ebf053ac5a02"} Oct 02 11:07:23 crc kubenswrapper[4835]: I1002 11:07:23.503357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"4de14f54bb3eae64b3854e41a32f4d12091b0f859e6d933b2eb85f059f4ecc3f"} Oct 02 11:07:24 crc kubenswrapper[4835]: I1002 11:07:24.266272 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c2dc14-32fa-43fc-ae87-11d02eb3400a" path="/var/lib/kubelet/pods/e1c2dc14-32fa-43fc-ae87-11d02eb3400a/volumes" Oct 02 11:07:25 crc kubenswrapper[4835]: I1002 11:07:25.517786 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"8d5aaf1d4d3e98e1a719dbe49eb0704523810f1d52a08f912cd82ae97ff7f94f"} Oct 02 11:07:28 crc kubenswrapper[4835]: I1002 11:07:28.543435 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" event={"ID":"d07255be-42e1-462f-a2a9-df47fb9de900","Type":"ContainerStarted","Data":"91e6bd2dc25c857b8a3eeb84a4981d1c6dced6931e6cf26966434574f5f90b00"} Oct 02 11:07:28 crc kubenswrapper[4835]: I1002 11:07:28.543982 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:28 crc kubenswrapper[4835]: I1002 11:07:28.544010 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:28 crc kubenswrapper[4835]: I1002 11:07:28.576309 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:28 crc kubenswrapper[4835]: I1002 11:07:28.582601 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" podStartSLOduration=7.582583052 podStartE2EDuration="7.582583052s" podCreationTimestamp="2025-10-02 11:07:21 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:07:28.579331547 +0000 UTC m=+725.139239158" watchObservedRunningTime="2025-10-02 11:07:28.582583052 +0000 UTC m=+725.142490633" Oct 02 11:07:29 crc kubenswrapper[4835]: I1002 11:07:29.552348 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:29 crc kubenswrapper[4835]: I1002 11:07:29.635824 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:34 crc kubenswrapper[4835]: I1002 11:07:34.255790 4835 scope.go:117] "RemoveContainer" containerID="0a6af3c25fd2b9444b9cb65bb4553e343f1f3d4362cac711a5c1a2252386b09b" Oct 02 11:07:34 crc kubenswrapper[4835]: E1002 11:07:34.256074 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-2tw4v_openshift-multus(cea2edfd-8b9c-44be-be9a-d2feb410da71)\"" pod="openshift-multus/multus-2tw4v" podUID="cea2edfd-8b9c-44be-be9a-d2feb410da71" Oct 02 11:07:49 crc kubenswrapper[4835]: I1002 11:07:49.252448 4835 scope.go:117] "RemoveContainer" containerID="0a6af3c25fd2b9444b9cb65bb4553e343f1f3d4362cac711a5c1a2252386b09b" Oct 02 11:07:49 crc kubenswrapper[4835]: I1002 11:07:49.679674 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2tw4v_cea2edfd-8b9c-44be-be9a-d2feb410da71/kube-multus/2.log" Oct 02 11:07:49 crc kubenswrapper[4835]: I1002 11:07:49.680068 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2tw4v" event={"ID":"cea2edfd-8b9c-44be-be9a-d2feb410da71","Type":"ContainerStarted","Data":"0901fdf5cdbbe77aec706010aa8c2e490ff8ecf713c974fedc804b2bc12cc2af"} Oct 02 11:07:52 crc kubenswrapper[4835]: I1002 11:07:52.174668 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5ns2" Oct 02 11:07:57 crc kubenswrapper[4835]: I1002 11:07:57.823424 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zl6kd"] Oct 02 11:07:57 crc kubenswrapper[4835]: I1002 11:07:57.824175 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" podUID="73f4901e-a2c3-474d-8d52-972b775c2017" containerName="controller-manager" containerID="cri-o://c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2" gracePeriod=30 Oct 02 11:07:57 crc kubenswrapper[4835]: I1002 11:07:57.880565 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp"] Oct 02 11:07:57 crc kubenswrapper[4835]: I1002 11:07:57.880784 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" podUID="3da125d5-132a-44df-ba26-fd6305dabcdc" containerName="route-controller-manager" containerID="cri-o://115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21" gracePeriod=30 Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.472883 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.544911 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r567v\" (UniqueName: \"kubernetes.io/projected/3da125d5-132a-44df-ba26-fd6305dabcdc-kube-api-access-r567v\") pod \"3da125d5-132a-44df-ba26-fd6305dabcdc\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.544962 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-client-ca\") pod \"3da125d5-132a-44df-ba26-fd6305dabcdc\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.544983 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da125d5-132a-44df-ba26-fd6305dabcdc-serving-cert\") pod \"3da125d5-132a-44df-ba26-fd6305dabcdc\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.545037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-config\") pod \"3da125d5-132a-44df-ba26-fd6305dabcdc\" (UID: \"3da125d5-132a-44df-ba26-fd6305dabcdc\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.545720 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-client-ca" (OuterVolumeSpecName: "client-ca") pod "3da125d5-132a-44df-ba26-fd6305dabcdc" (UID: "3da125d5-132a-44df-ba26-fd6305dabcdc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.545799 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-config" (OuterVolumeSpecName: "config") pod "3da125d5-132a-44df-ba26-fd6305dabcdc" (UID: "3da125d5-132a-44df-ba26-fd6305dabcdc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.550991 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da125d5-132a-44df-ba26-fd6305dabcdc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3da125d5-132a-44df-ba26-fd6305dabcdc" (UID: "3da125d5-132a-44df-ba26-fd6305dabcdc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.552818 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da125d5-132a-44df-ba26-fd6305dabcdc-kube-api-access-r567v" (OuterVolumeSpecName: "kube-api-access-r567v") pod "3da125d5-132a-44df-ba26-fd6305dabcdc" (UID: "3da125d5-132a-44df-ba26-fd6305dabcdc"). InnerVolumeSpecName "kube-api-access-r567v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.646657 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.646695 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r567v\" (UniqueName: \"kubernetes.io/projected/3da125d5-132a-44df-ba26-fd6305dabcdc-kube-api-access-r567v\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.646711 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3da125d5-132a-44df-ba26-fd6305dabcdc-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.646727 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da125d5-132a-44df-ba26-fd6305dabcdc-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.734214 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.738253 4835 generic.go:334] "Generic (PLEG): container finished" podID="3da125d5-132a-44df-ba26-fd6305dabcdc" containerID="115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21" exitCode=0 Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.738340 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.738342 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" event={"ID":"3da125d5-132a-44df-ba26-fd6305dabcdc","Type":"ContainerDied","Data":"115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21"} Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.739345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp" event={"ID":"3da125d5-132a-44df-ba26-fd6305dabcdc","Type":"ContainerDied","Data":"66125fd7bb1b943f57a2bad98d19dfad2552cd2c4726a6ca062b00914b88bde5"} Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.739378 4835 scope.go:117] "RemoveContainer" containerID="115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.742304 4835 generic.go:334] "Generic (PLEG): container finished" podID="73f4901e-a2c3-474d-8d52-972b775c2017" containerID="c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2" exitCode=0 Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.742348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" event={"ID":"73f4901e-a2c3-474d-8d52-972b775c2017","Type":"ContainerDied","Data":"c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2"} Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.742377 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" 
event={"ID":"73f4901e-a2c3-474d-8d52-972b775c2017","Type":"ContainerDied","Data":"1be9255f09240e9ff3743362e9e389fe9d286273b6fba38be644ae5ba1eb5676"} Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.742442 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zl6kd" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.747518 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-proxy-ca-bundles\") pod \"73f4901e-a2c3-474d-8d52-972b775c2017\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.747667 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73f4901e-a2c3-474d-8d52-972b775c2017-serving-cert\") pod \"73f4901e-a2c3-474d-8d52-972b775c2017\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.747759 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-client-ca\") pod \"73f4901e-a2c3-474d-8d52-972b775c2017\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.747844 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-config\") pod \"73f4901e-a2c3-474d-8d52-972b775c2017\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.747934 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdf55\" (UniqueName: \"kubernetes.io/projected/73f4901e-a2c3-474d-8d52-972b775c2017-kube-api-access-kdf55\") pod \"73f4901e-a2c3-474d-8d52-972b775c2017\" (UID: \"73f4901e-a2c3-474d-8d52-972b775c2017\") " Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.749900 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "73f4901e-a2c3-474d-8d52-972b775c2017" (UID: "73f4901e-a2c3-474d-8d52-972b775c2017"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.750058 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-config" (OuterVolumeSpecName: "config") pod "73f4901e-a2c3-474d-8d52-972b775c2017" (UID: "73f4901e-a2c3-474d-8d52-972b775c2017"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.750568 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-client-ca" (OuterVolumeSpecName: "client-ca") pod "73f4901e-a2c3-474d-8d52-972b775c2017" (UID: "73f4901e-a2c3-474d-8d52-972b775c2017"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.753633 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f4901e-a2c3-474d-8d52-972b775c2017-kube-api-access-kdf55" (OuterVolumeSpecName: "kube-api-access-kdf55") pod "73f4901e-a2c3-474d-8d52-972b775c2017" (UID: "73f4901e-a2c3-474d-8d52-972b775c2017"). InnerVolumeSpecName "kube-api-access-kdf55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.754870 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f4901e-a2c3-474d-8d52-972b775c2017-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "73f4901e-a2c3-474d-8d52-972b775c2017" (UID: "73f4901e-a2c3-474d-8d52-972b775c2017"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.779412 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp"] Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.779611 4835 scope.go:117] "RemoveContainer" containerID="115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21" Oct 02 11:07:58 crc kubenswrapper[4835]: E1002 11:07:58.781442 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21\": container with ID starting with 115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21 not found: ID does not exist" containerID="115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.781476 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21"} err="failed to get container status \"115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21\": rpc error: code = NotFound desc = could not find container \"115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21\": container with ID starting with 115ac0617d2441ca25da96ae456601f869319152c0e1b790ac919a56f38b9c21 not found: ID does not exist" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.781513 4835 scope.go:117] "RemoveContainer" containerID="c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.786894 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-shflp"] Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.799871 4835 scope.go:117] "RemoveContainer" containerID="c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2" Oct 02 11:07:58 crc kubenswrapper[4835]: E1002 11:07:58.800433 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2\": container with ID starting with c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2 not found: ID does not exist" containerID="c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.800500 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2"} err="failed to get container status \"c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2\": rpc error: code = NotFound desc = could not find container \"c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2\": container with ID starting with c1de022dbb21d05d0ce74263daf3c1fb2f46790053bb2d7253451cdd771785a2 not found: ID does not exist" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.848712 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.848754 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73f4901e-a2c3-474d-8d52-972b775c2017-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.848764 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.848774 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f4901e-a2c3-474d-8d52-972b775c2017-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:58 crc kubenswrapper[4835]: I1002 11:07:58.848784 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdf55\" (UniqueName: \"kubernetes.io/projected/73f4901e-a2c3-474d-8d52-972b775c2017-kube-api-access-kdf55\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.068334 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zl6kd"] Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.070895 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zl6kd"] Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.390607 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m"] Oct 02 11:07:59 crc kubenswrapper[4835]: E1002 11:07:59.390872 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da125d5-132a-44df-ba26-fd6305dabcdc" containerName="route-controller-manager" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.390889 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da125d5-132a-44df-ba26-fd6305dabcdc" containerName="route-controller-manager" Oct 02 11:07:59 crc kubenswrapper[4835]: E1002 11:07:59.390910 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f4901e-a2c3-474d-8d52-972b775c2017" containerName="controller-manager" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.390918 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f4901e-a2c3-474d-8d52-972b775c2017" containerName="controller-manager" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.391030 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da125d5-132a-44df-ba26-fd6305dabcdc" containerName="route-controller-manager" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.391056 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="73f4901e-a2c3-474d-8d52-972b775c2017" containerName="controller-manager" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.391893 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.393893 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.404416 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m"] Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.461603 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.461665 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblr9\" (UniqueName: \"kubernetes.io/projected/e133539a-9cba-48c1-896a-fb04fd1b3c14-kube-api-access-lblr9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.461911 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.562955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.563025 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.563053 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lblr9\" (UniqueName: \"kubernetes.io/projected/e133539a-9cba-48c1-896a-fb04fd1b3c14-kube-api-access-lblr9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 
11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.563645 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.563754 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.584836 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lblr9\" (UniqueName: \"kubernetes.io/projected/e133539a-9cba-48c1-896a-fb04fd1b3c14-kube-api-access-lblr9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.587398 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54c4b54d4c-8nggm"] Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.588232 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.588861 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz"] Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.589570 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.590192 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.590384 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.591355 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.591421 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.592951 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.592968 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.593069 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.593395 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.593421 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.593846 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.595508 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.600921 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.600641 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c4b54d4c-8nggm"] Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.602603 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.604493 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz"] Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.707889 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765337 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-config\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765395 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7073a307-29c1-40ce-9eb2-b77f67c7c11b-config\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765443 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7073a307-29c1-40ce-9eb2-b77f67c7c11b-client-ca\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765465 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3e8a4-1921-4945-b721-9e4e3091858e-serving-cert\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765481 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7073a307-29c1-40ce-9eb2-b77f67c7c11b-serving-cert\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765518 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-proxy-ca-bundles\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765773 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6mph\" (UniqueName: \"kubernetes.io/projected/7073a307-29c1-40ce-9eb2-b77f67c7c11b-kube-api-access-s6mph\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.765982 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm44c\" (UniqueName: \"kubernetes.io/projected/0bd3e8a4-1921-4945-b721-9e4e3091858e-kube-api-access-jm44c\") pod 
\"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.766074 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-client-ca\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.868814 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7073a307-29c1-40ce-9eb2-b77f67c7c11b-client-ca\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.869351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3e8a4-1921-4945-b721-9e4e3091858e-serving-cert\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.869797 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7073a307-29c1-40ce-9eb2-b77f67c7c11b-serving-cert\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.870750 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7073a307-29c1-40ce-9eb2-b77f67c7c11b-client-ca\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.871459 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-proxy-ca-bundles\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.869953 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-proxy-ca-bundles\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.871582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6mph\" (UniqueName: \"kubernetes.io/projected/7073a307-29c1-40ce-9eb2-b77f67c7c11b-kube-api-access-s6mph\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.871625 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm44c\" (UniqueName: \"kubernetes.io/projected/0bd3e8a4-1921-4945-b721-9e4e3091858e-kube-api-access-jm44c\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.871645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-client-ca\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.871695 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-config\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.871753 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7073a307-29c1-40ce-9eb2-b77f67c7c11b-config\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.873072 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-client-ca\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.873505 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3e8a4-1921-4945-b721-9e4e3091858e-config\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.874362 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7073a307-29c1-40ce-9eb2-b77f67c7c11b-serving-cert\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.878751 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7073a307-29c1-40ce-9eb2-b77f67c7c11b-config\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.883201 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3e8a4-1921-4945-b721-9e4e3091858e-serving-cert\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.891062 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6mph\" (UniqueName: \"kubernetes.io/projected/7073a307-29c1-40ce-9eb2-b77f67c7c11b-kube-api-access-s6mph\") pod \"route-controller-manager-6f99d55b96-vq7dz\" (UID: \"7073a307-29c1-40ce-9eb2-b77f67c7c11b\") " pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.894543 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm44c\" (UniqueName: \"kubernetes.io/projected/0bd3e8a4-1921-4945-b721-9e4e3091858e-kube-api-access-jm44c\") pod \"controller-manager-54c4b54d4c-8nggm\" (UID: \"0bd3e8a4-1921-4945-b721-9e4e3091858e\") " pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.906369 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m"] Oct 02 11:07:59 crc kubenswrapper[4835]: W1002 11:07:59.912837 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode133539a_9cba_48c1_896a_fb04fd1b3c14.slice/crio-4f22cb3c4f4d0d6d8987c6dd4fd727b1e3bd7a03006647eb24943b9520cd6fcc WatchSource:0}: Error finding container 4f22cb3c4f4d0d6d8987c6dd4fd727b1e3bd7a03006647eb24943b9520cd6fcc: Status 404 returned error can't find the container with id 4f22cb3c4f4d0d6d8987c6dd4fd727b1e3bd7a03006647eb24943b9520cd6fcc Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.919908 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:07:59 crc kubenswrapper[4835]: I1002 11:07:59.933649 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.134852 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c4b54d4c-8nggm"] Oct 02 11:08:00 crc kubenswrapper[4835]: W1002 11:08:00.135611 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd3e8a4_1921_4945_b721_9e4e3091858e.slice/crio-4eac54afba97388d2ac26fe0c07cdd0394dd8dee685377d68f6d94b3922e4419 WatchSource:0}: Error finding container 4eac54afba97388d2ac26fe0c07cdd0394dd8dee685377d68f6d94b3922e4419: Status 404 returned error can't find the container with id 4eac54afba97388d2ac26fe0c07cdd0394dd8dee685377d68f6d94b3922e4419 Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.184410 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz"] Oct 02 11:08:00 crc kubenswrapper[4835]: W1002 11:08:00.218936 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7073a307_29c1_40ce_9eb2_b77f67c7c11b.slice/crio-05e6a1cacac22a529c49114cff26f1876b965c43d249329ad92ffee90db42aee WatchSource:0}: Error finding container 05e6a1cacac22a529c49114cff26f1876b965c43d249329ad92ffee90db42aee: Status 404 returned error can't find the container with id 05e6a1cacac22a529c49114cff26f1876b965c43d249329ad92ffee90db42aee Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.260466 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da125d5-132a-44df-ba26-fd6305dabcdc" path="/var/lib/kubelet/pods/3da125d5-132a-44df-ba26-fd6305dabcdc/volumes" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.261304 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f4901e-a2c3-474d-8d52-972b775c2017" path="/var/lib/kubelet/pods/73f4901e-a2c3-474d-8d52-972b775c2017/volumes" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.757987 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" event={"ID":"7073a307-29c1-40ce-9eb2-b77f67c7c11b","Type":"ContainerStarted","Data":"f35185e8324c4c1a0080a7da4aaad1585b547e806683a5937c3881bdc36045a0"} Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.758030 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" event={"ID":"7073a307-29c1-40ce-9eb2-b77f67c7c11b","Type":"ContainerStarted","Data":"05e6a1cacac22a529c49114cff26f1876b965c43d249329ad92ffee90db42aee"} Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.758257 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.759772 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" event={"ID":"0bd3e8a4-1921-4945-b721-9e4e3091858e","Type":"ContainerStarted","Data":"8ec6ad4811b48aac238573bdee60f57d7ed263b95e24798a9ba803a87c158a9b"} Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.759801 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" 
event={"ID":"0bd3e8a4-1921-4945-b721-9e4e3091858e","Type":"ContainerStarted","Data":"4eac54afba97388d2ac26fe0c07cdd0394dd8dee685377d68f6d94b3922e4419"} Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.760174 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.761371 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" event={"ID":"e133539a-9cba-48c1-896a-fb04fd1b3c14","Type":"ContainerStarted","Data":"41048be23ea024ad34eb811031c1f9683e3e314ecb3853c71c59f2190ce4345a"} Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.761402 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" event={"ID":"e133539a-9cba-48c1-896a-fb04fd1b3c14","Type":"ContainerStarted","Data":"4f22cb3c4f4d0d6d8987c6dd4fd727b1e3bd7a03006647eb24943b9520cd6fcc"} Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.765309 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.783860 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" podStartSLOduration=2.783839526 podStartE2EDuration="2.783839526s" podCreationTimestamp="2025-10-02 11:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:08:00.779401997 +0000 UTC m=+757.339309578" watchObservedRunningTime="2025-10-02 11:08:00.783839526 +0000 UTC m=+757.343747107" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.820736 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54c4b54d4c-8nggm" podStartSLOduration=2.820711055 podStartE2EDuration="2.820711055s" podCreationTimestamp="2025-10-02 11:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:08:00.816570204 +0000 UTC m=+757.376477785" watchObservedRunningTime="2025-10-02 11:08:00.820711055 +0000 UTC m=+757.380618626" Oct 02 11:08:00 crc kubenswrapper[4835]: I1002 11:08:00.991092 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f99d55b96-vq7dz" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.445073 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h2c9t"] Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.447207 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.450211 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h2c9t"] Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.491865 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-utilities\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.492035 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-catalog-content\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.492075 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjws8\" (UniqueName: \"kubernetes.io/projected/a1410144-8ec7-453e-a307-c1b03d17056f-kube-api-access-gjws8\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.593973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-catalog-content\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.594041 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjws8\" (UniqueName: \"kubernetes.io/projected/a1410144-8ec7-453e-a307-c1b03d17056f-kube-api-access-gjws8\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.594108 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-utilities\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.594671 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-catalog-content\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.594888 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-utilities\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.629153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gjws8\" (UniqueName: \"kubernetes.io/projected/a1410144-8ec7-453e-a307-c1b03d17056f-kube-api-access-gjws8\") pod \"redhat-operators-h2c9t\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.769112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" event={"ID":"e133539a-9cba-48c1-896a-fb04fd1b3c14","Type":"ContainerDied","Data":"41048be23ea024ad34eb811031c1f9683e3e314ecb3853c71c59f2190ce4345a"} Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.769119 4835 generic.go:334] "Generic (PLEG): container finished" podID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerID="41048be23ea024ad34eb811031c1f9683e3e314ecb3853c71c59f2190ce4345a" exitCode=0 Oct 02 11:08:01 crc kubenswrapper[4835]: I1002 11:08:01.773273 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:02 crc kubenswrapper[4835]: I1002 11:08:02.250772 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h2c9t"] Oct 02 11:08:02 crc kubenswrapper[4835]: W1002 11:08:02.265490 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1410144_8ec7_453e_a307_c1b03d17056f.slice/crio-86ea7877a17cecacdb48f188818c7f131ccf77e4da900bee596798bfbaee3b28 WatchSource:0}: Error finding container 86ea7877a17cecacdb48f188818c7f131ccf77e4da900bee596798bfbaee3b28: Status 404 returned error can't find the container with id 86ea7877a17cecacdb48f188818c7f131ccf77e4da900bee596798bfbaee3b28 Oct 02 11:08:02 crc kubenswrapper[4835]: I1002 11:08:02.775755 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1410144-8ec7-453e-a307-c1b03d17056f" containerID="a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b" exitCode=0 Oct 02 11:08:02 crc kubenswrapper[4835]: I1002 11:08:02.775876 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2c9t" event={"ID":"a1410144-8ec7-453e-a307-c1b03d17056f","Type":"ContainerDied","Data":"a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b"} Oct 02 11:08:02 crc kubenswrapper[4835]: I1002 11:08:02.776400 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2c9t" event={"ID":"a1410144-8ec7-453e-a307-c1b03d17056f","Type":"ContainerStarted","Data":"86ea7877a17cecacdb48f188818c7f131ccf77e4da900bee596798bfbaee3b28"} Oct 02 11:08:04 crc kubenswrapper[4835]: I1002 11:08:04.790576 4835 generic.go:334] "Generic (PLEG): container finished" podID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerID="7b292db34fea2f89af5372fa4358ef0ad9cd0bfa7437af736a140b6914ac4b2d" exitCode=0 Oct 02 11:08:04 crc kubenswrapper[4835]: I1002 11:08:04.790949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" event={"ID":"e133539a-9cba-48c1-896a-fb04fd1b3c14","Type":"ContainerDied","Data":"7b292db34fea2f89af5372fa4358ef0ad9cd0bfa7437af736a140b6914ac4b2d"} Oct 02 11:08:04 crc kubenswrapper[4835]: I1002 11:08:04.794430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2c9t" 
event={"ID":"a1410144-8ec7-453e-a307-c1b03d17056f","Type":"ContainerDied","Data":"1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5"} Oct 02 11:08:04 crc kubenswrapper[4835]: I1002 11:08:04.794286 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1410144-8ec7-453e-a307-c1b03d17056f" containerID="1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5" exitCode=0 Oct 02 11:08:05 crc kubenswrapper[4835]: I1002 11:08:05.351024 4835 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 11:08:05 crc kubenswrapper[4835]: I1002 11:08:05.804141 4835 generic.go:334] "Generic (PLEG): container finished" podID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerID="4a693236ebef00091f23910bba6bd6f248d3b859de132c9f791477f4f1b8b5e3" exitCode=0 Oct 02 11:08:05 crc kubenswrapper[4835]: I1002 11:08:05.804188 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" event={"ID":"e133539a-9cba-48c1-896a-fb04fd1b3c14","Type":"ContainerDied","Data":"4a693236ebef00091f23910bba6bd6f248d3b859de132c9f791477f4f1b8b5e3"} Oct 02 11:08:06 crc kubenswrapper[4835]: I1002 11:08:06.811970 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2c9t" event={"ID":"a1410144-8ec7-453e-a307-c1b03d17056f","Type":"ContainerStarted","Data":"f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172"} Oct 02 11:08:06 crc kubenswrapper[4835]: I1002 11:08:06.830082 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h2c9t" podStartSLOduration=3.2090707529999998 podStartE2EDuration="5.830054103s" podCreationTimestamp="2025-10-02 11:08:01 +0000 UTC" firstStartedPulling="2025-10-02 11:08:02.778118297 +0000 UTC m=+759.338025878" lastFinishedPulling="2025-10-02 11:08:05.399101647 +0000 UTC m=+761.959009228" observedRunningTime="2025-10-02 11:08:06.827821498 +0000 UTC m=+763.387729119" watchObservedRunningTime="2025-10-02 11:08:06.830054103 +0000 UTC m=+763.389961714" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.132124 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.184630 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-util\") pod \"e133539a-9cba-48c1-896a-fb04fd1b3c14\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.184758 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-bundle\") pod \"e133539a-9cba-48c1-896a-fb04fd1b3c14\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.184798 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lblr9\" (UniqueName: \"kubernetes.io/projected/e133539a-9cba-48c1-896a-fb04fd1b3c14-kube-api-access-lblr9\") pod \"e133539a-9cba-48c1-896a-fb04fd1b3c14\" (UID: \"e133539a-9cba-48c1-896a-fb04fd1b3c14\") " Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.186557 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-bundle" (OuterVolumeSpecName: "bundle") pod "e133539a-9cba-48c1-896a-fb04fd1b3c14" (UID: "e133539a-9cba-48c1-896a-fb04fd1b3c14"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.196744 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-util" (OuterVolumeSpecName: "util") pod "e133539a-9cba-48c1-896a-fb04fd1b3c14" (UID: "e133539a-9cba-48c1-896a-fb04fd1b3c14"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.203341 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e133539a-9cba-48c1-896a-fb04fd1b3c14-kube-api-access-lblr9" (OuterVolumeSpecName: "kube-api-access-lblr9") pod "e133539a-9cba-48c1-896a-fb04fd1b3c14" (UID: "e133539a-9cba-48c1-896a-fb04fd1b3c14"). InnerVolumeSpecName "kube-api-access-lblr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.286259 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.286290 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lblr9\" (UniqueName: \"kubernetes.io/projected/e133539a-9cba-48c1-896a-fb04fd1b3c14-kube-api-access-lblr9\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.286301 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e133539a-9cba-48c1-896a-fb04fd1b3c14-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.818636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" event={"ID":"e133539a-9cba-48c1-896a-fb04fd1b3c14","Type":"ContainerDied","Data":"4f22cb3c4f4d0d6d8987c6dd4fd727b1e3bd7a03006647eb24943b9520cd6fcc"} Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.818677 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m" Oct 02 11:08:07 crc kubenswrapper[4835]: I1002 11:08:07.818689 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f22cb3c4f4d0d6d8987c6dd4fd727b1e3bd7a03006647eb24943b9520cd6fcc" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.736568 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w"] Oct 02 11:08:09 crc kubenswrapper[4835]: E1002 11:08:09.737068 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerName="util" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.737079 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerName="util" Oct 02 11:08:09 crc kubenswrapper[4835]: E1002 11:08:09.737098 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerName="extract" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.737104 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerName="extract" Oct 02 11:08:09 crc kubenswrapper[4835]: E1002 11:08:09.737123 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerName="pull" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.737129 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerName="pull" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.737244 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e133539a-9cba-48c1-896a-fb04fd1b3c14" containerName="extract" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.737651 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.743200 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-28ng9" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.744444 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.750487 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.760453 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w"] Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.819379 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxcdk\" (UniqueName: \"kubernetes.io/projected/6d38f1a8-cd4c-4e77-905b-0480f95167d6-kube-api-access-kxcdk\") pod \"nmstate-operator-858ddd8f98-nrg4w\" (UID: \"6d38f1a8-cd4c-4e77-905b-0480f95167d6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.920771 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxcdk\" (UniqueName: \"kubernetes.io/projected/6d38f1a8-cd4c-4e77-905b-0480f95167d6-kube-api-access-kxcdk\") pod \"nmstate-operator-858ddd8f98-nrg4w\" (UID: \"6d38f1a8-cd4c-4e77-905b-0480f95167d6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" Oct 02 11:08:09 crc kubenswrapper[4835]: I1002 11:08:09.941752 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxcdk\" (UniqueName: \"kubernetes.io/projected/6d38f1a8-cd4c-4e77-905b-0480f95167d6-kube-api-access-kxcdk\") pod \"nmstate-operator-858ddd8f98-nrg4w\" (UID: \"6d38f1a8-cd4c-4e77-905b-0480f95167d6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" Oct 02 11:08:10 crc kubenswrapper[4835]: I1002 11:08:10.053904 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" Oct 02 11:08:10 crc kubenswrapper[4835]: I1002 11:08:10.531336 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w"] Oct 02 11:08:10 crc kubenswrapper[4835]: I1002 11:08:10.835415 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" event={"ID":"6d38f1a8-cd4c-4e77-905b-0480f95167d6","Type":"ContainerStarted","Data":"4b9da243cc5f74fbc7ead10754a3fef78e2eaa5582c957a034a67d3f7edb8987"} Oct 02 11:08:11 crc kubenswrapper[4835]: I1002 11:08:11.774161 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:11 crc kubenswrapper[4835]: I1002 11:08:11.774690 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:11 crc kubenswrapper[4835]: I1002 11:08:11.816383 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:11 crc kubenswrapper[4835]: I1002 11:08:11.910858 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:13 crc kubenswrapper[4835]: I1002 11:08:13.865208 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" event={"ID":"6d38f1a8-cd4c-4e77-905b-0480f95167d6","Type":"ContainerStarted","Data":"e9dc99f9b0cc1e5c89dfa80681d71296c31abc4320ee32a59ccaec9767d6cb90"} Oct 02 11:08:13 crc kubenswrapper[4835]: I1002 11:08:13.884830 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nrg4w" podStartSLOduration=2.615434079 podStartE2EDuration="4.884803074s" podCreationTimestamp="2025-10-02 11:08:09 +0000 UTC" firstStartedPulling="2025-10-02 11:08:10.550484825 +0000 UTC m=+767.110392406" lastFinishedPulling="2025-10-02 11:08:12.81985382 +0000 UTC m=+769.379761401" observedRunningTime="2025-10-02 11:08:13.88192608 +0000 UTC m=+770.441833681" watchObservedRunningTime="2025-10-02 11:08:13.884803074 +0000 UTC m=+770.444710695" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.226648 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h2c9t"] Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.226961 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h2c9t" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="registry-server" containerID="cri-o://f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172" gracePeriod=2 Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.791922 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.871417 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1410144-8ec7-453e-a307-c1b03d17056f" containerID="f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172" exitCode=0 Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.871486 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2c9t" event={"ID":"a1410144-8ec7-453e-a307-c1b03d17056f","Type":"ContainerDied","Data":"f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172"} Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.871505 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h2c9t" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.871558 4835 scope.go:117] "RemoveContainer" containerID="f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.871541 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2c9t" event={"ID":"a1410144-8ec7-453e-a307-c1b03d17056f","Type":"ContainerDied","Data":"86ea7877a17cecacdb48f188818c7f131ccf77e4da900bee596798bfbaee3b28"} Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.885263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjws8\" (UniqueName: \"kubernetes.io/projected/a1410144-8ec7-453e-a307-c1b03d17056f-kube-api-access-gjws8\") pod \"a1410144-8ec7-453e-a307-c1b03d17056f\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.885327 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-catalog-content\") pod \"a1410144-8ec7-453e-a307-c1b03d17056f\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.885387 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-utilities\") pod \"a1410144-8ec7-453e-a307-c1b03d17056f\" (UID: \"a1410144-8ec7-453e-a307-c1b03d17056f\") " Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.886299 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-utilities" (OuterVolumeSpecName: "utilities") pod "a1410144-8ec7-453e-a307-c1b03d17056f" (UID: "a1410144-8ec7-453e-a307-c1b03d17056f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.886444 4835 scope.go:117] "RemoveContainer" containerID="1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.891321 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1410144-8ec7-453e-a307-c1b03d17056f-kube-api-access-gjws8" (OuterVolumeSpecName: "kube-api-access-gjws8") pod "a1410144-8ec7-453e-a307-c1b03d17056f" (UID: "a1410144-8ec7-453e-a307-c1b03d17056f"). InnerVolumeSpecName "kube-api-access-gjws8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.932725 4835 scope.go:117] "RemoveContainer" containerID="a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.949994 4835 scope.go:117] "RemoveContainer" containerID="f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172" Oct 02 11:08:14 crc kubenswrapper[4835]: E1002 11:08:14.950565 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172\": container with ID starting with f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172 not found: ID does not exist" containerID="f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.950628 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172"} err="failed to get container status \"f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172\": rpc error: code = NotFound desc = could not find container \"f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172\": container with ID starting with f4b19a39c8d54e51a23083d47d28549421020e476b9329aea84fb9166f83f172 not found: ID does not exist" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.950668 4835 scope.go:117] "RemoveContainer" containerID="1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5" Oct 02 11:08:14 crc kubenswrapper[4835]: E1002 11:08:14.951156 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5\": container with ID starting with 1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5 not found: ID does not exist" containerID="1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.951184 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5"} err="failed to get container status \"1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5\": rpc error: code = NotFound desc = could not find container \"1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5\": container with ID starting with 1e7867341ccc9602467ab491818a8cbe8c2e2ebee1e29f1d9c755971493922f5 not found: ID does not exist" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.951203 4835 scope.go:117] "RemoveContainer" containerID="a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b" Oct 02 11:08:14 crc kubenswrapper[4835]: E1002 11:08:14.951554 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b\": container with ID starting with a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b not found: ID does not exist" containerID="a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b" Oct 02 11:08:14 crc kubenswrapper[4835]: I1002 11:08:14.951581 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b"} err="failed to get container status \"a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b\": rpc error: code = NotFound desc = could not find container \"a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b\": container with ID starting with a208642666dd6333813d4cd19b7c875919ceba7adb0e6e30ebde1b6aa3341a6b not found: ID does not exist" Oct 02 11:08:15 crc kubenswrapper[4835]: I1002 11:08:15.009569 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjws8\" (UniqueName: \"kubernetes.io/projected/a1410144-8ec7-453e-a307-c1b03d17056f-kube-api-access-gjws8\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:15 crc kubenswrapper[4835]: I1002 11:08:15.009619 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:15 crc kubenswrapper[4835]: I1002 11:08:15.374288 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1410144-8ec7-453e-a307-c1b03d17056f" (UID: "a1410144-8ec7-453e-a307-c1b03d17056f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:08:15 crc kubenswrapper[4835]: I1002 11:08:15.414685 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1410144-8ec7-453e-a307-c1b03d17056f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:15 crc kubenswrapper[4835]: I1002 11:08:15.502887 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h2c9t"] Oct 02 11:08:15 crc kubenswrapper[4835]: I1002 11:08:15.507166 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h2c9t"] Oct 02 11:08:16 crc kubenswrapper[4835]: I1002 11:08:16.259973 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" path="/var/lib/kubelet/pods/a1410144-8ec7-453e-a307-c1b03d17056f/volumes" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.608025 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts"] Oct 02 11:08:19 crc kubenswrapper[4835]: E1002 11:08:19.609111 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="extract-utilities" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.609212 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="extract-utilities" Oct 02 11:08:19 crc kubenswrapper[4835]: E1002 11:08:19.609321 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="registry-server" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.609395 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="registry-server" Oct 02 11:08:19 crc kubenswrapper[4835]: E1002 11:08:19.609476 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="extract-content" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.609554 4835 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="extract-content" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.609775 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1410144-8ec7-453e-a307-c1b03d17056f" containerName="registry-server" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.610664 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.615293 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hwvjd" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.622535 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts"] Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.634208 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx"] Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.635430 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:19 crc kubenswrapper[4835]: W1002 11:08:19.637274 4835 reflector.go:561] object-"openshift-nmstate"/"openshift-nmstate-webhook": failed to list *v1.Secret: secrets "openshift-nmstate-webhook" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Oct 02 11:08:19 crc kubenswrapper[4835]: E1002 11:08:19.637319 4835 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"openshift-nmstate-webhook\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-nmstate-webhook\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.650982 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-tpjx2"] Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.651894 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.662569 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx"] Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.671983 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv8n7\" (UniqueName: \"kubernetes.io/projected/d5a0e56a-5f7e-4342-9571-3edcc1135a44-kube-api-access-rv8n7\") pod \"nmstate-metrics-fdff9cb8d-r88ts\" (UID: \"d5a0e56a-5f7e-4342-9571-3edcc1135a44\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.672039 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjlt\" (UniqueName: \"kubernetes.io/projected/b384ef09-a91b-41f1-9cc5-35146749d375-kube-api-access-lqjlt\") pod \"nmstate-webhook-6cdbc54649-qgcrx\" (UID: \"b384ef09-a91b-41f1-9cc5-35146749d375\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.672275 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b384ef09-a91b-41f1-9cc5-35146749d375-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-qgcrx\" (UID: \"b384ef09-a91b-41f1-9cc5-35146749d375\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.766550 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl"] Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.769310 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.778773 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.779245 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.779984 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b384ef09-a91b-41f1-9cc5-35146749d375-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-qgcrx\" (UID: \"b384ef09-a91b-41f1-9cc5-35146749d375\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.780349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-ovs-socket\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.780455 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-nmstate-lock\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.780540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv8n7\" (UniqueName: \"kubernetes.io/projected/d5a0e56a-5f7e-4342-9571-3edcc1135a44-kube-api-access-rv8n7\") pod \"nmstate-metrics-fdff9cb8d-r88ts\" (UID: \"d5a0e56a-5f7e-4342-9571-3edcc1135a44\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.780636 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjlt\" (UniqueName: \"kubernetes.io/projected/b384ef09-a91b-41f1-9cc5-35146749d375-kube-api-access-lqjlt\") pod \"nmstate-webhook-6cdbc54649-qgcrx\" (UID: \"b384ef09-a91b-41f1-9cc5-35146749d375\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.780926 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-dbus-socket\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.781005 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mf98\" (UniqueName: \"kubernetes.io/projected/2322eea6-e397-4f14-ab1c-960128541d48-kube-api-access-6mf98\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.783631 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hn6xb" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.808511 4835 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl"] Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.810543 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv8n7\" (UniqueName: \"kubernetes.io/projected/d5a0e56a-5f7e-4342-9571-3edcc1135a44-kube-api-access-rv8n7\") pod \"nmstate-metrics-fdff9cb8d-r88ts\" (UID: \"d5a0e56a-5f7e-4342-9571-3edcc1135a44\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.817127 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjlt\" (UniqueName: \"kubernetes.io/projected/b384ef09-a91b-41f1-9cc5-35146749d375-kube-api-access-lqjlt\") pod \"nmstate-webhook-6cdbc54649-qgcrx\" (UID: \"b384ef09-a91b-41f1-9cc5-35146749d375\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887019 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a76ca7e-9356-424e-82ed-d184275f398e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887374 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a76ca7e-9356-424e-82ed-d184275f398e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887470 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2gr8\" (UniqueName: \"kubernetes.io/projected/5a76ca7e-9356-424e-82ed-d184275f398e-kube-api-access-c2gr8\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887561 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-ovs-socket\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887643 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-nmstate-lock\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-ovs-socket\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-nmstate-lock\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887719 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-dbus-socket\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.887820 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mf98\" (UniqueName: \"kubernetes.io/projected/2322eea6-e397-4f14-ab1c-960128541d48-kube-api-access-6mf98\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.889740 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2322eea6-e397-4f14-ab1c-960128541d48-dbus-socket\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.909944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mf98\" (UniqueName: \"kubernetes.io/projected/2322eea6-e397-4f14-ab1c-960128541d48-kube-api-access-6mf98\") pod \"nmstate-handler-tpjx2\" (UID: \"2322eea6-e397-4f14-ab1c-960128541d48\") " pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.931628 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.958188 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-9dc6869c9-g7v5k"] Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.959702 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.988002 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.988805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a76ca7e-9356-424e-82ed-d184275f398e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.988905 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a76ca7e-9356-424e-82ed-d184275f398e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.988944 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gr8\" (UniqueName: \"kubernetes.io/projected/5a76ca7e-9356-424e-82ed-d184275f398e-kube-api-access-c2gr8\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:19 crc kubenswrapper[4835]: E1002 11:08:19.989112 4835 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 02 11:08:19 crc kubenswrapper[4835]: E1002 11:08:19.989192 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a76ca7e-9356-424e-82ed-d184275f398e-plugin-serving-cert podName:5a76ca7e-9356-424e-82ed-d184275f398e nodeName:}" failed. No retries permitted until 2025-10-02 11:08:20.489165016 +0000 UTC m=+777.049072597 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/5a76ca7e-9356-424e-82ed-d184275f398e-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-5skfl" (UID: "5a76ca7e-9356-424e-82ed-d184275f398e") : secret "plugin-serving-cert" not found Oct 02 11:08:19 crc kubenswrapper[4835]: I1002 11:08:19.990153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a76ca7e-9356-424e-82ed-d184275f398e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:20 crc kubenswrapper[4835]: W1002 11:08:20.009204 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2322eea6_e397_4f14_ab1c_960128541d48.slice/crio-fdc564f5c60e4e9ad073f5d3978ad250c82e9e5815e21a81d3f8575b7ff02f52 WatchSource:0}: Error finding container fdc564f5c60e4e9ad073f5d3978ad250c82e9e5815e21a81d3f8575b7ff02f52: Status 404 returned error can't find the container with id fdc564f5c60e4e9ad073f5d3978ad250c82e9e5815e21a81d3f8575b7ff02f52 Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.028652 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gr8\" (UniqueName: \"kubernetes.io/projected/5a76ca7e-9356-424e-82ed-d184275f398e-kube-api-access-c2gr8\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.033993 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9dc6869c9-g7v5k"] Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.090124 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-oauth-serving-cert\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.090521 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-trusted-ca-bundle\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.090557 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-console-config\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.090579 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a004da7a-a01e-46f8-953f-89160aa20579-console-oauth-config\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.090602 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a004da7a-a01e-46f8-953f-89160aa20579-console-serving-cert\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.090715 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-service-ca\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.090746 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t9d8\" (UniqueName: \"kubernetes.io/projected/a004da7a-a01e-46f8-953f-89160aa20579-kube-api-access-9t9d8\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.191743 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-oauth-serving-cert\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.191804 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-trusted-ca-bundle\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.191832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-console-config\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.191853 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a004da7a-a01e-46f8-953f-89160aa20579-console-oauth-config\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.191876 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a004da7a-a01e-46f8-953f-89160aa20579-console-serving-cert\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.191911 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-service-ca\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: 
I1002 11:08:20.191930 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t9d8\" (UniqueName: \"kubernetes.io/projected/a004da7a-a01e-46f8-953f-89160aa20579-kube-api-access-9t9d8\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.193165 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-trusted-ca-bundle\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.193734 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-console-config\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.193984 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-oauth-serving-cert\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.194933 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a004da7a-a01e-46f8-953f-89160aa20579-service-ca\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.197428 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a004da7a-a01e-46f8-953f-89160aa20579-console-serving-cert\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.202769 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a004da7a-a01e-46f8-953f-89160aa20579-console-oauth-config\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.206428 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t9d8\" (UniqueName: \"kubernetes.io/projected/a004da7a-a01e-46f8-953f-89160aa20579-kube-api-access-9t9d8\") pod \"console-9dc6869c9-g7v5k\" (UID: \"a004da7a-a01e-46f8-953f-89160aa20579\") " pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.310290 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.366045 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts"] Oct 02 11:08:20 crc kubenswrapper[4835]: W1002 11:08:20.373277 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a0e56a_5f7e_4342_9571_3edcc1135a44.slice/crio-ba2fcb780c996ee06c91963449d34daab83af95e0649621d6233991ef51068c8 WatchSource:0}: Error finding container ba2fcb780c996ee06c91963449d34daab83af95e0649621d6233991ef51068c8: Status 404 returned error can't find the container with id ba2fcb780c996ee06c91963449d34daab83af95e0649621d6233991ef51068c8 Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.497193 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a76ca7e-9356-424e-82ed-d184275f398e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.501001 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a76ca7e-9356-424e-82ed-d184275f398e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5skfl\" (UID: \"5a76ca7e-9356-424e-82ed-d184275f398e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.695703 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.712071 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9dc6869c9-g7v5k"] Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.717998 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 02 11:08:20 crc kubenswrapper[4835]: W1002 11:08:20.719205 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda004da7a_a01e_46f8_953f_89160aa20579.slice/crio-fdb51811fb54cee92b9f2bc2a5bc0d9cf98fe9b456fc938846756c5cf66c3c3f WatchSource:0}: Error finding container fdb51811fb54cee92b9f2bc2a5bc0d9cf98fe9b456fc938846756c5cf66c3c3f: Status 404 returned error can't find the container with id fdb51811fb54cee92b9f2bc2a5bc0d9cf98fe9b456fc938846756c5cf66c3c3f Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.727142 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b384ef09-a91b-41f1-9cc5-35146749d375-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-qgcrx\" (UID: \"b384ef09-a91b-41f1-9cc5-35146749d375\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.852406 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.908105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9dc6869c9-g7v5k" event={"ID":"a004da7a-a01e-46f8-953f-89160aa20579","Type":"ContainerStarted","Data":"fdb51811fb54cee92b9f2bc2a5bc0d9cf98fe9b456fc938846756c5cf66c3c3f"} Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.909447 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tpjx2" event={"ID":"2322eea6-e397-4f14-ab1c-960128541d48","Type":"ContainerStarted","Data":"fdc564f5c60e4e9ad073f5d3978ad250c82e9e5815e21a81d3f8575b7ff02f52"} Oct 02 11:08:20 crc kubenswrapper[4835]: I1002 11:08:20.910153 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" event={"ID":"d5a0e56a-5f7e-4342-9571-3edcc1135a44","Type":"ContainerStarted","Data":"ba2fcb780c996ee06c91963449d34daab83af95e0649621d6233991ef51068c8"} Oct 02 11:08:21 crc kubenswrapper[4835]: I1002 11:08:21.140733 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl"] Oct 02 11:08:21 crc kubenswrapper[4835]: W1002 11:08:21.148348 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a76ca7e_9356_424e_82ed_d184275f398e.slice/crio-6e5abe6b7fd00d0ea873c364eed097760fcc03c0a0a998ffc1ae26b43fb89c40 WatchSource:0}: Error finding container 6e5abe6b7fd00d0ea873c364eed097760fcc03c0a0a998ffc1ae26b43fb89c40: Status 404 returned error can't find the container with id 6e5abe6b7fd00d0ea873c364eed097760fcc03c0a0a998ffc1ae26b43fb89c40 Oct 02 11:08:21 crc kubenswrapper[4835]: I1002 11:08:21.257397 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx"] Oct 02 11:08:21 crc kubenswrapper[4835]: I1002 11:08:21.919816 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9dc6869c9-g7v5k" event={"ID":"a004da7a-a01e-46f8-953f-89160aa20579","Type":"ContainerStarted","Data":"ddb18ef44f9ea964f2cfd8a3e4f7e8772ae92e0246fb63376120284f7e45023f"} Oct 02 11:08:21 crc kubenswrapper[4835]: I1002 11:08:21.923044 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" event={"ID":"b384ef09-a91b-41f1-9cc5-35146749d375","Type":"ContainerStarted","Data":"cce72d77d2e9ce57bc5351a18735c439d21612f2a1d507b4c3ab6a2331919c45"} Oct 02 11:08:21 crc kubenswrapper[4835]: I1002 11:08:21.931015 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" event={"ID":"5a76ca7e-9356-424e-82ed-d184275f398e","Type":"ContainerStarted","Data":"6e5abe6b7fd00d0ea873c364eed097760fcc03c0a0a998ffc1ae26b43fb89c40"} Oct 02 11:08:21 crc kubenswrapper[4835]: I1002 11:08:21.943077 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9dc6869c9-g7v5k" podStartSLOduration=2.943053344 podStartE2EDuration="2.943053344s" podCreationTimestamp="2025-10-02 11:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:08:21.941358275 +0000 UTC m=+778.501265856" watchObservedRunningTime="2025-10-02 11:08:21.943053344 +0000 UTC m=+778.502960935" Oct 02 11:08:22 crc kubenswrapper[4835]: I1002 11:08:22.941867 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" event={"ID":"d5a0e56a-5f7e-4342-9571-3edcc1135a44","Type":"ContainerStarted","Data":"e3814e62e34408b2afb38dd1049451f0384732d739dcc325dfc19fcab8e2cfe6"} Oct 02 11:08:22 crc kubenswrapper[4835]: I1002 11:08:22.943378 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" event={"ID":"b384ef09-a91b-41f1-9cc5-35146749d375","Type":"ContainerStarted","Data":"9e761dea7198c5b44979546740052aec5a41dec76152a8efb321aace590b9a1f"} Oct 02 11:08:22 crc kubenswrapper[4835]: I1002 11:08:22.943435 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:22 crc kubenswrapper[4835]: I1002 11:08:22.973398 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" podStartSLOduration=2.522620146 podStartE2EDuration="3.97336029s" podCreationTimestamp="2025-10-02 11:08:19 +0000 UTC" firstStartedPulling="2025-10-02 11:08:21.269900831 +0000 UTC m=+777.829808422" lastFinishedPulling="2025-10-02 11:08:22.720640985 +0000 UTC m=+779.280548566" observedRunningTime="2025-10-02 11:08:22.964919354 +0000 UTC m=+779.524826945" watchObservedRunningTime="2025-10-02 11:08:22.97336029 +0000 UTC m=+779.533267881" Oct 02 11:08:23 crc kubenswrapper[4835]: I1002 11:08:23.957232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tpjx2" event={"ID":"2322eea6-e397-4f14-ab1c-960128541d48","Type":"ContainerStarted","Data":"32a5e570f07f33196bebb9f6ddac2544b609317062d1c451ce24ce96819e1793"} Oct 02 11:08:23 crc kubenswrapper[4835]: I1002 11:08:23.978860 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-tpjx2" podStartSLOduration=2.274806514 podStartE2EDuration="4.978840751s" podCreationTimestamp="2025-10-02 11:08:19 +0000 UTC" firstStartedPulling="2025-10-02 11:08:20.015366212 +0000 UTC m=+776.575273793" lastFinishedPulling="2025-10-02 11:08:22.719400449 +0000 UTC m=+779.279308030" observedRunningTime="2025-10-02 11:08:23.975427532 +0000 UTC m=+780.535335133" watchObservedRunningTime="2025-10-02 11:08:23.978840751 +0000 UTC m=+780.538748332" Oct 02 11:08:24 crc kubenswrapper[4835]: I1002 11:08:24.983455 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" event={"ID":"5a76ca7e-9356-424e-82ed-d184275f398e","Type":"ContainerStarted","Data":"fc25ceafd67ffd2b033a70dd09b77413d6742d4399001d58e9ca3e71c4a7e034"} Oct 02 11:08:24 crc kubenswrapper[4835]: I1002 11:08:24.983764 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:25 crc kubenswrapper[4835]: I1002 11:08:25.009532 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5skfl" podStartSLOduration=3.2607336350000002 podStartE2EDuration="6.009508658s" podCreationTimestamp="2025-10-02 11:08:19 +0000 UTC" firstStartedPulling="2025-10-02 11:08:21.150073077 +0000 UTC m=+777.709980658" lastFinishedPulling="2025-10-02 11:08:23.89884809 +0000 UTC m=+780.458755681" observedRunningTime="2025-10-02 11:08:25.001507764 +0000 UTC m=+781.561415355" watchObservedRunningTime="2025-10-02 11:08:25.009508658 +0000 UTC m=+781.569416239" Oct 02 11:08:25 crc kubenswrapper[4835]: 
I1002 11:08:25.992495 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" event={"ID":"d5a0e56a-5f7e-4342-9571-3edcc1135a44","Type":"ContainerStarted","Data":"aa671a210f53edf15a0d5d926addfdb8dda2ffa8d53208ccffe385a74beaa638"} Oct 02 11:08:26 crc kubenswrapper[4835]: I1002 11:08:26.017384 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-r88ts" podStartSLOduration=2.159041318 podStartE2EDuration="7.017344848s" podCreationTimestamp="2025-10-02 11:08:19 +0000 UTC" firstStartedPulling="2025-10-02 11:08:20.375430442 +0000 UTC m=+776.935338023" lastFinishedPulling="2025-10-02 11:08:25.233733972 +0000 UTC m=+781.793641553" observedRunningTime="2025-10-02 11:08:26.011094486 +0000 UTC m=+782.571002097" watchObservedRunningTime="2025-10-02 11:08:26.017344848 +0000 UTC m=+782.577252509" Oct 02 11:08:30 crc kubenswrapper[4835]: I1002 11:08:30.020851 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-tpjx2" Oct 02 11:08:30 crc kubenswrapper[4835]: I1002 11:08:30.310619 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:30 crc kubenswrapper[4835]: I1002 11:08:30.310713 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:30 crc kubenswrapper[4835]: I1002 11:08:30.318661 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:31 crc kubenswrapper[4835]: I1002 11:08:31.030648 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9dc6869c9-g7v5k" Oct 02 11:08:31 crc kubenswrapper[4835]: I1002 11:08:31.099839 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rzxmz"] Oct 02 11:08:40 crc kubenswrapper[4835]: I1002 11:08:40.862314 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-qgcrx" Oct 02 11:08:41 crc kubenswrapper[4835]: I1002 11:08:41.984716 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:08:41 crc kubenswrapper[4835]: I1002 11:08:41.984798 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.606671 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6"] Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.609693 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6"] Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.609823 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.619420 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.775205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.775282 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.775375 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqfd\" (UniqueName: \"kubernetes.io/projected/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-kube-api-access-ckqfd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.876497 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.876587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.876738 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqfd\" (UniqueName: \"kubernetes.io/projected/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-kube-api-access-ckqfd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.876897 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.877265 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.898685 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqfd\" (UniqueName: \"kubernetes.io/projected/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-kube-api-access-ckqfd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:54 crc kubenswrapper[4835]: I1002 11:08:54.935983 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:08:55 crc kubenswrapper[4835]: I1002 11:08:55.389302 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6"] Oct 02 11:08:55 crc kubenswrapper[4835]: W1002 11:08:55.395681 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e68d9dc_7ae9_44e2_be9e_88a18450e2db.slice/crio-812c3c3dc1746ea38ce28f53f76b0befb7e9bd5eda02488aa51819a6d87df094 WatchSource:0}: Error finding container 812c3c3dc1746ea38ce28f53f76b0befb7e9bd5eda02488aa51819a6d87df094: Status 404 returned error can't find the container with id 812c3c3dc1746ea38ce28f53f76b0befb7e9bd5eda02488aa51819a6d87df094 Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.141059 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rzxmz" podUID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" containerName="console" containerID="cri-o://11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb" gracePeriod=15 Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.207923 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerID="b417d033d171af545c1a1965ba1ab6c34515d16207d6cce3e112f08b55070e86" exitCode=0 Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.207979 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" event={"ID":"2e68d9dc-7ae9-44e2-be9e-88a18450e2db","Type":"ContainerDied","Data":"b417d033d171af545c1a1965ba1ab6c34515d16207d6cce3e112f08b55070e86"} Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.208036 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" event={"ID":"2e68d9dc-7ae9-44e2-be9e-88a18450e2db","Type":"ContainerStarted","Data":"812c3c3dc1746ea38ce28f53f76b0befb7e9bd5eda02488aa51819a6d87df094"} Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.538063 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rzxmz_9f6e2da8-85f9-479d-ab37-fc8bc136ceb0/console/0.log" Oct 
02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.538155 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.608488 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-serving-cert\") pod \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.608605 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-oauth-config\") pod \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.608643 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khf59\" (UniqueName: \"kubernetes.io/projected/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-kube-api-access-khf59\") pod \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.608674 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-service-ca\") pod \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.608699 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-config\") pod \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.608725 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-trusted-ca-bundle\") pod \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.608754 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-oauth-serving-cert\") pod \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\" (UID: \"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0\") " Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.609553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" (UID: "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.609586 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-service-ca" (OuterVolumeSpecName: "service-ca") pod "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" (UID: "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.609623 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-config" (OuterVolumeSpecName: "console-config") pod "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" (UID: "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.609806 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" (UID: "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.614640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-kube-api-access-khf59" (OuterVolumeSpecName: "kube-api-access-khf59") pod "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" (UID: "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0"). InnerVolumeSpecName "kube-api-access-khf59". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.615403 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" (UID: "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.615838 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" (UID: "9f6e2da8-85f9-479d-ab37-fc8bc136ceb0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.710829 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.710884 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.710897 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khf59\" (UniqueName: \"kubernetes.io/projected/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-kube-api-access-khf59\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.710915 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.710926 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.710938 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:56 crc kubenswrapper[4835]: I1002 11:08:56.710952 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.220062 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rzxmz_9f6e2da8-85f9-479d-ab37-fc8bc136ceb0/console/0.log" Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.220519 4835 generic.go:334] "Generic (PLEG): container finished" podID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" containerID="11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb" exitCode=2 Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.220570 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rzxmz" event={"ID":"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0","Type":"ContainerDied","Data":"11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb"} Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.220616 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rzxmz" event={"ID":"9f6e2da8-85f9-479d-ab37-fc8bc136ceb0","Type":"ContainerDied","Data":"5e5c620ec42765cebe7c6a975a2f3625089389c75b7439f16e5f37ecc7e02271"} Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.220617 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rzxmz" Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.220742 4835 scope.go:117] "RemoveContainer" containerID="11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb" Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.245572 4835 scope.go:117] "RemoveContainer" containerID="11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb" Oct 02 11:08:57 crc kubenswrapper[4835]: E1002 11:08:57.246363 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb\": container with ID starting with 11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb not found: ID does not exist" containerID="11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb" Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.246407 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb"} err="failed to get container status \"11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb\": rpc error: code = NotFound desc = could not find container \"11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb\": container with ID starting with 11d5930615eb85f210579b21b1d447868743c04698b737b2beffcb229f0324fb not found: ID does not exist" Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.278867 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rzxmz"] Oct 02 11:08:57 crc kubenswrapper[4835]: I1002 11:08:57.289717 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rzxmz"] Oct 02 11:08:58 crc kubenswrapper[4835]: I1002 11:08:58.238125 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerID="46fb93a61f73361ab4951110670dc339dd5e9b9874ea3db0ad079a40d43cc7b5" exitCode=0 Oct 02 11:08:58 crc kubenswrapper[4835]: I1002 11:08:58.238213 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" event={"ID":"2e68d9dc-7ae9-44e2-be9e-88a18450e2db","Type":"ContainerDied","Data":"46fb93a61f73361ab4951110670dc339dd5e9b9874ea3db0ad079a40d43cc7b5"} Oct 02 11:08:58 crc kubenswrapper[4835]: I1002 11:08:58.262922 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" path="/var/lib/kubelet/pods/9f6e2da8-85f9-479d-ab37-fc8bc136ceb0/volumes" Oct 02 11:08:59 crc kubenswrapper[4835]: I1002 11:08:59.257476 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerID="d12624d356f64bd1c1bd4d953c47beafbeb6c4d7feb07428524da76ad0c450c9" exitCode=0 Oct 02 11:08:59 crc kubenswrapper[4835]: I1002 11:08:59.257552 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" event={"ID":"2e68d9dc-7ae9-44e2-be9e-88a18450e2db","Type":"ContainerDied","Data":"d12624d356f64bd1c1bd4d953c47beafbeb6c4d7feb07428524da76ad0c450c9"} Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.575804 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.764743 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqfd\" (UniqueName: \"kubernetes.io/projected/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-kube-api-access-ckqfd\") pod \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.764886 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-bundle\") pod \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.764931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-util\") pod \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\" (UID: \"2e68d9dc-7ae9-44e2-be9e-88a18450e2db\") " Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.766395 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-bundle" (OuterVolumeSpecName: "bundle") pod "2e68d9dc-7ae9-44e2-be9e-88a18450e2db" (UID: "2e68d9dc-7ae9-44e2-be9e-88a18450e2db"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.771597 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-kube-api-access-ckqfd" (OuterVolumeSpecName: "kube-api-access-ckqfd") pod "2e68d9dc-7ae9-44e2-be9e-88a18450e2db" (UID: "2e68d9dc-7ae9-44e2-be9e-88a18450e2db"). InnerVolumeSpecName "kube-api-access-ckqfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.866427 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckqfd\" (UniqueName: \"kubernetes.io/projected/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-kube-api-access-ckqfd\") on node \"crc\" DevicePath \"\"" Oct 02 11:09:00 crc kubenswrapper[4835]: I1002 11:09:00.866820 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:09:01 crc kubenswrapper[4835]: I1002 11:09:01.164472 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-util" (OuterVolumeSpecName: "util") pod "2e68d9dc-7ae9-44e2-be9e-88a18450e2db" (UID: "2e68d9dc-7ae9-44e2-be9e-88a18450e2db"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:09:01 crc kubenswrapper[4835]: I1002 11:09:01.172721 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68d9dc-7ae9-44e2-be9e-88a18450e2db-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:09:01 crc kubenswrapper[4835]: I1002 11:09:01.276762 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" event={"ID":"2e68d9dc-7ae9-44e2-be9e-88a18450e2db","Type":"ContainerDied","Data":"812c3c3dc1746ea38ce28f53f76b0befb7e9bd5eda02488aa51819a6d87df094"} Oct 02 11:09:01 crc kubenswrapper[4835]: I1002 11:09:01.276827 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="812c3c3dc1746ea38ce28f53f76b0befb7e9bd5eda02488aa51819a6d87df094" Oct 02 11:09:01 crc kubenswrapper[4835]: I1002 11:09:01.276869 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.970516 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh"] Oct 02 11:09:09 crc kubenswrapper[4835]: E1002 11:09:09.971680 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerName="extract" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.971697 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerName="extract" Oct 02 11:09:09 crc kubenswrapper[4835]: E1002 11:09:09.971712 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerName="pull" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.971719 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerName="pull" Oct 02 11:09:09 crc kubenswrapper[4835]: E1002 11:09:09.971730 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" containerName="console" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.971741 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" containerName="console" Oct 02 11:09:09 crc kubenswrapper[4835]: E1002 11:09:09.971751 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerName="util" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.971757 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerName="util" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.971862 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6e2da8-85f9-479d-ab37-fc8bc136ceb0" containerName="console" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.971874 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e68d9dc-7ae9-44e2-be9e-88a18450e2db" containerName="extract" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.972433 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.975200 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.975534 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.975677 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.975801 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 11:09:09 crc kubenswrapper[4835]: I1002 11:09:09.976918 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fkqz7" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.051394 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-webhook-cert\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.051518 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9k6\" (UniqueName: \"kubernetes.io/projected/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-kube-api-access-rb9k6\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.051557 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-apiservice-cert\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.102477 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh"] Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.152882 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9k6\" (UniqueName: \"kubernetes.io/projected/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-kube-api-access-rb9k6\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.152944 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-apiservice-cert\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.152995 
4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-webhook-cert\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.162544 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-apiservice-cert\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.162544 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-webhook-cert\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.200273 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9k6\" (UniqueName: \"kubernetes.io/projected/da4a91c1-ddf2-466a-99fd-dcc2be9dcb19-kube-api-access-rb9k6\") pod \"metallb-operator-controller-manager-6cc456f76c-4fqdh\" (UID: \"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19\") " pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.292582 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.446267 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm"] Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.447749 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.453601 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.453827 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.459378 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xq6j8" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.465766 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm"] Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.570594 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce54775f-de78-454a-b9f0-6e17daec8861-apiservice-cert\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.570675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sszhl\" (UniqueName: \"kubernetes.io/projected/ce54775f-de78-454a-b9f0-6e17daec8861-kube-api-access-sszhl\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.570740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce54775f-de78-454a-b9f0-6e17daec8861-webhook-cert\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.672582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sszhl\" (UniqueName: \"kubernetes.io/projected/ce54775f-de78-454a-b9f0-6e17daec8861-kube-api-access-sszhl\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.672690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce54775f-de78-454a-b9f0-6e17daec8861-webhook-cert\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.672753 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce54775f-de78-454a-b9f0-6e17daec8861-apiservice-cert\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 
11:09:10.677363 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce54775f-de78-454a-b9f0-6e17daec8861-apiservice-cert\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.678727 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce54775f-de78-454a-b9f0-6e17daec8861-webhook-cert\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.692124 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sszhl\" (UniqueName: \"kubernetes.io/projected/ce54775f-de78-454a-b9f0-6e17daec8861-kube-api-access-sszhl\") pod \"metallb-operator-webhook-server-645c758f95-4lwwm\" (UID: \"ce54775f-de78-454a-b9f0-6e17daec8861\") " pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.789694 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh"] Oct 02 11:09:10 crc kubenswrapper[4835]: I1002 11:09:10.799295 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:10 crc kubenswrapper[4835]: W1002 11:09:10.814395 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda4a91c1_ddf2_466a_99fd_dcc2be9dcb19.slice/crio-5f267d900cff9a45a535f3559e7a9c97441ef4dbd451cca1915c65fe949b2927 WatchSource:0}: Error finding container 5f267d900cff9a45a535f3559e7a9c97441ef4dbd451cca1915c65fe949b2927: Status 404 returned error can't find the container with id 5f267d900cff9a45a535f3559e7a9c97441ef4dbd451cca1915c65fe949b2927 Oct 02 11:09:11 crc kubenswrapper[4835]: I1002 11:09:11.242869 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm"] Oct 02 11:09:11 crc kubenswrapper[4835]: I1002 11:09:11.346467 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" event={"ID":"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19","Type":"ContainerStarted","Data":"5f267d900cff9a45a535f3559e7a9c97441ef4dbd451cca1915c65fe949b2927"} Oct 02 11:09:11 crc kubenswrapper[4835]: I1002 11:09:11.348505 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" event={"ID":"ce54775f-de78-454a-b9f0-6e17daec8861","Type":"ContainerStarted","Data":"3a946904d4ea387649b33cc53ce82c18da8066c878e63dea9c280468bcc69f56"} Oct 02 11:09:11 crc kubenswrapper[4835]: I1002 11:09:11.984253 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:09:11 crc kubenswrapper[4835]: I1002 11:09:11.984317 4835 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:09:17 crc kubenswrapper[4835]: I1002 11:09:17.404175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" event={"ID":"da4a91c1-ddf2-466a-99fd-dcc2be9dcb19","Type":"ContainerStarted","Data":"858be388301078ea2c3f918210d8673377b59752f9f23285478ad7b663840e73"} Oct 02 11:09:17 crc kubenswrapper[4835]: I1002 11:09:17.405576 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:17 crc kubenswrapper[4835]: I1002 11:09:17.407416 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" event={"ID":"ce54775f-de78-454a-b9f0-6e17daec8861","Type":"ContainerStarted","Data":"9bf04ccf8932250f43a9def20172bcb201a4bc0cc38f4f43530ddeef74d91f8d"} Oct 02 11:09:17 crc kubenswrapper[4835]: I1002 11:09:17.408674 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:17 crc kubenswrapper[4835]: I1002 11:09:17.424405 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" podStartSLOduration=2.860349084 podStartE2EDuration="8.424379123s" podCreationTimestamp="2025-10-02 11:09:09 +0000 UTC" firstStartedPulling="2025-10-02 11:09:10.820963695 +0000 UTC m=+827.380871276" lastFinishedPulling="2025-10-02 11:09:16.384993734 +0000 UTC m=+832.944901315" observedRunningTime="2025-10-02 11:09:17.423997172 +0000 UTC m=+833.983904773" watchObservedRunningTime="2025-10-02 11:09:17.424379123 +0000 UTC m=+833.984286694" Oct 02 11:09:17 crc kubenswrapper[4835]: I1002 11:09:17.451109 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" podStartSLOduration=2.302698085 podStartE2EDuration="7.451088472s" podCreationTimestamp="2025-10-02 11:09:10 +0000 UTC" firstStartedPulling="2025-10-02 11:09:11.254337804 +0000 UTC m=+827.814245395" lastFinishedPulling="2025-10-02 11:09:16.402728201 +0000 UTC m=+832.962635782" observedRunningTime="2025-10-02 11:09:17.449434434 +0000 UTC m=+834.009342025" watchObservedRunningTime="2025-10-02 11:09:17.451088472 +0000 UTC m=+834.010996053" Oct 02 11:09:30 crc kubenswrapper[4835]: I1002 11:09:30.804336 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-645c758f95-4lwwm" Oct 02 11:09:41 crc kubenswrapper[4835]: I1002 11:09:41.984510 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:09:41 crc kubenswrapper[4835]: I1002 11:09:41.985089 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:09:41 crc kubenswrapper[4835]: I1002 11:09:41.985155 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:09:41 crc kubenswrapper[4835]: I1002 11:09:41.986034 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9482e972b371878a44442b6006c3a59a025e351c1ec2a25e635bbcea7c81f32b"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:09:41 crc kubenswrapper[4835]: I1002 11:09:41.986128 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://9482e972b371878a44442b6006c3a59a025e351c1ec2a25e635bbcea7c81f32b" gracePeriod=600 Oct 02 11:09:42 crc kubenswrapper[4835]: I1002 11:09:42.599249 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="9482e972b371878a44442b6006c3a59a025e351c1ec2a25e635bbcea7c81f32b" exitCode=0 Oct 02 11:09:42 crc kubenswrapper[4835]: I1002 11:09:42.599299 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"9482e972b371878a44442b6006c3a59a025e351c1ec2a25e635bbcea7c81f32b"} Oct 02 11:09:42 crc kubenswrapper[4835]: I1002 11:09:42.599363 4835 scope.go:117] "RemoveContainer" containerID="d053440f4d28c5e876a916d9ffabfb83edb8e183c19537340d638e68ba6f5bab" Oct 02 11:09:43 crc kubenswrapper[4835]: I1002 11:09:43.610122 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"2d0ce126b4147f93bbd26e8b66aa4ae542cae2002f8fd305bd26ccf1276aba52"} Oct 02 11:09:50 crc kubenswrapper[4835]: I1002 11:09:50.296082 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6cc456f76c-4fqdh" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.025842 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-slt97"] Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.026883 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.028253 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hdp2h"] Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.029669 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.029778 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-8k7h9" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.030565 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.033197 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.038885 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.081930 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-slt97"] Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088429 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-sockets\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088514 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/487c5efa-50e7-4182-a79b-b3848a8e1bd4-cert\") pod \"frr-k8s-webhook-server-64bf5d555-slt97\" (UID: \"487c5efa-50e7-4182-a79b-b3848a8e1bd4\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088548 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088574 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-conf\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088611 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-reloader\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088631 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics-certs\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088660 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smqg\" (UniqueName: \"kubernetes.io/projected/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-kube-api-access-5smqg\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088686 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-startup\") pod \"frr-k8s-hdp2h\" (UID: 
\"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.088729 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt8md\" (UniqueName: \"kubernetes.io/projected/487c5efa-50e7-4182-a79b-b3848a8e1bd4-kube-api-access-zt8md\") pod \"frr-k8s-webhook-server-64bf5d555-slt97\" (UID: \"487c5efa-50e7-4182-a79b-b3848a8e1bd4\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.142255 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-str4z"] Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.143452 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.146579 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.146776 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mb65w" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.148013 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.151689 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190368 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metrics-certs\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190420 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-reloader\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics-certs\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k29nn\" (UniqueName: \"kubernetes.io/projected/37346ff3-98f3-4dfc-b677-1f14e4b5a506-kube-api-access-k29nn\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.190545 4835 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smqg\" (UniqueName: \"kubernetes.io/projected/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-kube-api-access-5smqg\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " 
pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.190604 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics-certs podName:89a37e75-2e26-4eac-bd07-061b2cb7f0a9 nodeName:}" failed. No retries permitted until 2025-10-02 11:09:51.690583026 +0000 UTC m=+868.250490597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics-certs") pod "frr-k8s-hdp2h" (UID: "89a37e75-2e26-4eac-bd07-061b2cb7f0a9") : secret "frr-k8s-certs-secret" not found Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190622 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-startup\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190677 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metallb-excludel2\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190698 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190789 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt8md\" (UniqueName: \"kubernetes.io/projected/487c5efa-50e7-4182-a79b-b3848a8e1bd4-kube-api-access-zt8md\") pod \"frr-k8s-webhook-server-64bf5d555-slt97\" (UID: \"487c5efa-50e7-4182-a79b-b3848a8e1bd4\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-sockets\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.190945 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/487c5efa-50e7-4182-a79b-b3848a8e1bd4-cert\") pod \"frr-k8s-webhook-server-64bf5d555-slt97\" (UID: \"487c5efa-50e7-4182-a79b-b3848a8e1bd4\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.191007 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.191058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-conf\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.191158 4835 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.191281 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487c5efa-50e7-4182-a79b-b3848a8e1bd4-cert podName:487c5efa-50e7-4182-a79b-b3848a8e1bd4 nodeName:}" failed. No retries permitted until 2025-10-02 11:09:51.691255725 +0000 UTC m=+868.251163486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/487c5efa-50e7-4182-a79b-b3848a8e1bd4-cert") pod "frr-k8s-webhook-server-64bf5d555-slt97" (UID: "487c5efa-50e7-4182-a79b-b3848a8e1bd4") : secret "frr-k8s-webhook-server-cert" not found Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.191614 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-reloader\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.191766 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-sockets\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.192186 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-startup\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.192426 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.192539 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-frr-conf\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.213944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smqg\" (UniqueName: \"kubernetes.io/projected/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-kube-api-access-5smqg\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.218886 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-gmhcv"] Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.220112 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.224182 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.228080 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt8md\" (UniqueName: \"kubernetes.io/projected/487c5efa-50e7-4182-a79b-b3848a8e1bd4-kube-api-access-zt8md\") pod \"frr-k8s-webhook-server-64bf5d555-slt97\" (UID: \"487c5efa-50e7-4182-a79b-b3848a8e1bd4\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.292044 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metallb-excludel2\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.292110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.292197 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f54912d4-2f6d-4c0d-947a-7f89783a8708-metrics-certs\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.292253 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metrics-certs\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.292277 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f54912d4-2f6d-4c0d-947a-7f89783a8708-cert\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.292301 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzbxh\" (UniqueName: \"kubernetes.io/projected/f54912d4-2f6d-4c0d-947a-7f89783a8708-kube-api-access-kzbxh\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.292334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k29nn\" (UniqueName: \"kubernetes.io/projected/37346ff3-98f3-4dfc-b677-1f14e4b5a506-kube-api-access-k29nn\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.293367 4835 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 02 11:09:51 crc 
kubenswrapper[4835]: E1002 11:09:51.293443 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metrics-certs podName:37346ff3-98f3-4dfc-b677-1f14e4b5a506 nodeName:}" failed. No retries permitted until 2025-10-02 11:09:51.793422441 +0000 UTC m=+868.353330022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metrics-certs") pod "speaker-str4z" (UID: "37346ff3-98f3-4dfc-b677-1f14e4b5a506") : secret "speaker-certs-secret" not found Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.293472 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metallb-excludel2\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.293570 4835 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.293614 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist podName:37346ff3-98f3-4dfc-b677-1f14e4b5a506 nodeName:}" failed. No retries permitted until 2025-10-02 11:09:51.793599126 +0000 UTC m=+868.353506907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist") pod "speaker-str4z" (UID: "37346ff3-98f3-4dfc-b677-1f14e4b5a506") : secret "metallb-memberlist" not found Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.312646 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-gmhcv"] Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.316769 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k29nn\" (UniqueName: \"kubernetes.io/projected/37346ff3-98f3-4dfc-b677-1f14e4b5a506-kube-api-access-k29nn\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.393340 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f54912d4-2f6d-4c0d-947a-7f89783a8708-metrics-certs\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.393760 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f54912d4-2f6d-4c0d-947a-7f89783a8708-cert\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.393785 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzbxh\" (UniqueName: \"kubernetes.io/projected/f54912d4-2f6d-4c0d-947a-7f89783a8708-kube-api-access-kzbxh\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.395574 4835 
reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.400331 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f54912d4-2f6d-4c0d-947a-7f89783a8708-metrics-certs\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.407853 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f54912d4-2f6d-4c0d-947a-7f89783a8708-cert\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.409496 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzbxh\" (UniqueName: \"kubernetes.io/projected/f54912d4-2f6d-4c0d-947a-7f89783a8708-kube-api-access-kzbxh\") pod \"controller-68d546b9d8-gmhcv\" (UID: \"f54912d4-2f6d-4c0d-947a-7f89783a8708\") " pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.577073 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.697509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics-certs\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.698967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/487c5efa-50e7-4182-a79b-b3848a8e1bd4-cert\") pod \"frr-k8s-webhook-server-64bf5d555-slt97\" (UID: \"487c5efa-50e7-4182-a79b-b3848a8e1bd4\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.705798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/487c5efa-50e7-4182-a79b-b3848a8e1bd4-cert\") pod \"frr-k8s-webhook-server-64bf5d555-slt97\" (UID: \"487c5efa-50e7-4182-a79b-b3848a8e1bd4\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.705946 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89a37e75-2e26-4eac-bd07-061b2cb7f0a9-metrics-certs\") pod \"frr-k8s-hdp2h\" (UID: \"89a37e75-2e26-4eac-bd07-061b2cb7f0a9\") " pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.801436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metrics-certs\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.803002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist\") pod \"speaker-str4z\" (UID: 
\"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.803291 4835 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 11:09:51 crc kubenswrapper[4835]: E1002 11:09:51.803406 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist podName:37346ff3-98f3-4dfc-b677-1f14e4b5a506 nodeName:}" failed. No retries permitted until 2025-10-02 11:09:52.803378927 +0000 UTC m=+869.363286538 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist") pod "speaker-str4z" (UID: "37346ff3-98f3-4dfc-b677-1f14e4b5a506") : secret "metallb-memberlist" not found Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.806529 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-metrics-certs\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.948645 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.958425 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:09:51 crc kubenswrapper[4835]: I1002 11:09:51.993597 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-gmhcv"] Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.208801 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-slt97"] Oct 02 11:09:52 crc kubenswrapper[4835]: W1002 11:09:52.218855 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod487c5efa_50e7_4182_a79b_b3848a8e1bd4.slice/crio-a71bd160ff02080aea947ac111aface3ec0e735c545c6bf15e18fdabbd21bee4 WatchSource:0}: Error finding container a71bd160ff02080aea947ac111aface3ec0e735c545c6bf15e18fdabbd21bee4: Status 404 returned error can't find the container with id a71bd160ff02080aea947ac111aface3ec0e735c545c6bf15e18fdabbd21bee4 Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.669686 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" event={"ID":"487c5efa-50e7-4182-a79b-b3848a8e1bd4","Type":"ContainerStarted","Data":"a71bd160ff02080aea947ac111aface3ec0e735c545c6bf15e18fdabbd21bee4"} Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.674419 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"16490941225f4b87005a93391aed9db5cdc118244ce6018120a59483abfb089f"} Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.677126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-gmhcv" event={"ID":"f54912d4-2f6d-4c0d-947a-7f89783a8708","Type":"ContainerStarted","Data":"f504477b0cd265e6f8df58a8038f6b97203ae81bcd45c5c46f5556329cb5747e"} Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.677184 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/controller-68d546b9d8-gmhcv" event={"ID":"f54912d4-2f6d-4c0d-947a-7f89783a8708","Type":"ContainerStarted","Data":"2597e8e67efa4a11b52d4a0debbdd4c04c4438f4af8fa6839b3ff49955cae2df"} Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.677198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-gmhcv" event={"ID":"f54912d4-2f6d-4c0d-947a-7f89783a8708","Type":"ContainerStarted","Data":"ba4e169be3afad2aa9653bb8f0da76cb6adc4bed7e9331e9f2fb3fadd51bb696"} Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.677264 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.695147 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-gmhcv" podStartSLOduration=1.695123873 podStartE2EDuration="1.695123873s" podCreationTimestamp="2025-10-02 11:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:09:52.694560847 +0000 UTC m=+869.254468438" watchObservedRunningTime="2025-10-02 11:09:52.695123873 +0000 UTC m=+869.255031454" Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.818195 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.825456 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/37346ff3-98f3-4dfc-b677-1f14e4b5a506-memberlist\") pod \"speaker-str4z\" (UID: \"37346ff3-98f3-4dfc-b677-1f14e4b5a506\") " pod="metallb-system/speaker-str4z" Oct 02 11:09:52 crc kubenswrapper[4835]: I1002 11:09:52.961339 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-str4z" Oct 02 11:09:52 crc kubenswrapper[4835]: W1002 11:09:52.981682 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37346ff3_98f3_4dfc_b677_1f14e4b5a506.slice/crio-e45cedc5723edf41c4cf23dc8dd5c54db557fa59c309afedb9bfdab58f23b0f0 WatchSource:0}: Error finding container e45cedc5723edf41c4cf23dc8dd5c54db557fa59c309afedb9bfdab58f23b0f0: Status 404 returned error can't find the container with id e45cedc5723edf41c4cf23dc8dd5c54db557fa59c309afedb9bfdab58f23b0f0 Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.412400 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f25px"] Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.413837 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.460966 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f25px"] Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.528811 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-catalog-content\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.528859 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5mv\" (UniqueName: \"kubernetes.io/projected/44847688-613f-4198-97b5-196c36150dbb-kube-api-access-hx5mv\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.528912 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-utilities\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.630259 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-catalog-content\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.630315 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5mv\" (UniqueName: \"kubernetes.io/projected/44847688-613f-4198-97b5-196c36150dbb-kube-api-access-hx5mv\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.630361 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-utilities\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.630887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-utilities\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.631149 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-catalog-content\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.697261 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hx5mv\" (UniqueName: \"kubernetes.io/projected/44847688-613f-4198-97b5-196c36150dbb-kube-api-access-hx5mv\") pod \"community-operators-f25px\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.720318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-str4z" event={"ID":"37346ff3-98f3-4dfc-b677-1f14e4b5a506","Type":"ContainerStarted","Data":"edd8903128d698c640289b05d0b462c93425ff9ddd0296643546734bef505800"} Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.720385 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-str4z" event={"ID":"37346ff3-98f3-4dfc-b677-1f14e4b5a506","Type":"ContainerStarted","Data":"2e1182d81bf76c19d0bab742cc7cdd1f0acbcddc5b95067e2bf0ee79b15c1627"} Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.720400 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-str4z" event={"ID":"37346ff3-98f3-4dfc-b677-1f14e4b5a506","Type":"ContainerStarted","Data":"e45cedc5723edf41c4cf23dc8dd5c54db557fa59c309afedb9bfdab58f23b0f0"} Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.721315 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-str4z" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.732298 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:09:53 crc kubenswrapper[4835]: I1002 11:09:53.754332 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-str4z" podStartSLOduration=2.7543141049999997 podStartE2EDuration="2.754314105s" podCreationTimestamp="2025-10-02 11:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:09:53.751941419 +0000 UTC m=+870.311849010" watchObservedRunningTime="2025-10-02 11:09:53.754314105 +0000 UTC m=+870.314221686" Oct 02 11:09:54 crc kubenswrapper[4835]: I1002 11:09:54.284382 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f25px"] Oct 02 11:09:54 crc kubenswrapper[4835]: I1002 11:09:54.737733 4835 generic.go:334] "Generic (PLEG): container finished" podID="44847688-613f-4198-97b5-196c36150dbb" containerID="0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb" exitCode=0 Oct 02 11:09:54 crc kubenswrapper[4835]: I1002 11:09:54.738994 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f25px" event={"ID":"44847688-613f-4198-97b5-196c36150dbb","Type":"ContainerDied","Data":"0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb"} Oct 02 11:09:54 crc kubenswrapper[4835]: I1002 11:09:54.739023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f25px" event={"ID":"44847688-613f-4198-97b5-196c36150dbb","Type":"ContainerStarted","Data":"f869a13f6ca0b85d75f45587e6d40115345be812c91ab732a6cc8d5ddfa8ee43"} Oct 02 11:09:55 crc kubenswrapper[4835]: I1002 11:09:55.765347 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f25px" event={"ID":"44847688-613f-4198-97b5-196c36150dbb","Type":"ContainerStarted","Data":"9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd"} Oct 02 
11:09:56 crc kubenswrapper[4835]: I1002 11:09:56.789481 4835 generic.go:334] "Generic (PLEG): container finished" podID="44847688-613f-4198-97b5-196c36150dbb" containerID="9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd" exitCode=0 Oct 02 11:09:56 crc kubenswrapper[4835]: I1002 11:09:56.789542 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f25px" event={"ID":"44847688-613f-4198-97b5-196c36150dbb","Type":"ContainerDied","Data":"9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd"} Oct 02 11:09:57 crc kubenswrapper[4835]: I1002 11:09:57.796840 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f25px" event={"ID":"44847688-613f-4198-97b5-196c36150dbb","Type":"ContainerStarted","Data":"d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac"} Oct 02 11:09:57 crc kubenswrapper[4835]: I1002 11:09:57.821168 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f25px" podStartSLOduration=2.31652607 podStartE2EDuration="4.821145181s" podCreationTimestamp="2025-10-02 11:09:53 +0000 UTC" firstStartedPulling="2025-10-02 11:09:54.740545362 +0000 UTC m=+871.300452943" lastFinishedPulling="2025-10-02 11:09:57.245164473 +0000 UTC m=+873.805072054" observedRunningTime="2025-10-02 11:09:57.814927067 +0000 UTC m=+874.374834668" watchObservedRunningTime="2025-10-02 11:09:57.821145181 +0000 UTC m=+874.381052762" Oct 02 11:10:00 crc kubenswrapper[4835]: I1002 11:10:00.818433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" event={"ID":"487c5efa-50e7-4182-a79b-b3848a8e1bd4","Type":"ContainerStarted","Data":"bcca84cf30ad68d5ad39139a03a51314cbeffde9b11659b8d101d392737e6b60"} Oct 02 11:10:00 crc kubenswrapper[4835]: I1002 11:10:00.819362 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:10:00 crc kubenswrapper[4835]: I1002 11:10:00.820516 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"47fd137e4c7ad9ab327affe9d4465d44eaf1c73a637f4260c663389ebe06f664"} Oct 02 11:10:00 crc kubenswrapper[4835]: I1002 11:10:00.837987 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" podStartSLOduration=1.5893194560000001 podStartE2EDuration="9.837961932s" podCreationTimestamp="2025-10-02 11:09:51 +0000 UTC" firstStartedPulling="2025-10-02 11:09:52.220984762 +0000 UTC m=+868.780892343" lastFinishedPulling="2025-10-02 11:10:00.469627238 +0000 UTC m=+877.029534819" observedRunningTime="2025-10-02 11:10:00.83574192 +0000 UTC m=+877.395649501" watchObservedRunningTime="2025-10-02 11:10:00.837961932 +0000 UTC m=+877.397869523" Oct 02 11:10:01 crc kubenswrapper[4835]: I1002 11:10:01.829079 4835 generic.go:334] "Generic (PLEG): container finished" podID="89a37e75-2e26-4eac-bd07-061b2cb7f0a9" containerID="47fd137e4c7ad9ab327affe9d4465d44eaf1c73a637f4260c663389ebe06f664" exitCode=0 Oct 02 11:10:01 crc kubenswrapper[4835]: I1002 11:10:01.829183 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerDied","Data":"47fd137e4c7ad9ab327affe9d4465d44eaf1c73a637f4260c663389ebe06f664"} 
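[editor's note] The "Observed pod startup duration" records above show how the reported podStartSLOduration relates to podStartE2EDuration: subtracting the image-pull window (lastFinishedPulling minus firstStartedPulling) from the end-to-end duration reproduces the SLO figure exactly, e.g. 4.821145181s minus (11:09:57.245164473 - 11:09:54.740545362) = 2.31652607s for community-operators-f25px, and likewise 9.837961932s minus the 8.248642476s pull window gives 1.589319456s for frr-k8s-webhook-server. The sketch below is illustrative only, not kubelet source; it simply recomputes that figure from the timestamps logged above.

```go
package main

import (
	"fmt"
	"time"
)

// Illustrative sketch (assumption, not kubelet code): recompute the
// podStartSLOduration reported for community-operators-f25px, treating
// SLO duration as podStartE2EDuration minus the image-pull window
// (lastFinishedPulling - firstStartedPulling), which is what the logged
// figures imply.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps copied verbatim from the log record above.
	firstStartedPulling := parse("2025-10-02 11:09:54.740545362 +0000 UTC")
	lastFinishedPulling := parse("2025-10-02 11:09:57.245164473 +0000 UTC")
	podStartE2E := 4.821145181 // seconds, as logged in podStartE2EDuration

	pullWindow := lastFinishedPulling.Sub(firstStartedPulling).Seconds()
	slo := podStartE2E - pullWindow

	fmt.Printf("image pull window:   %.9f s\n", pullWindow) // ~2.504619111
	fmt.Printf("podStartSLOduration: %.9f s\n", slo)        // ~2.316526070, matching the log
}
```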
Oct 02 11:10:02 crc kubenswrapper[4835]: I1002 11:10:02.840588 4835 generic.go:334] "Generic (PLEG): container finished" podID="89a37e75-2e26-4eac-bd07-061b2cb7f0a9" containerID="e62b90ba1dc45f12e1666a530cefdb8ef149423196377fb891e6a450838ef469" exitCode=0 Oct 02 11:10:02 crc kubenswrapper[4835]: I1002 11:10:02.840655 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerDied","Data":"e62b90ba1dc45f12e1666a530cefdb8ef149423196377fb891e6a450838ef469"} Oct 02 11:10:03 crc kubenswrapper[4835]: I1002 11:10:03.733377 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:10:03 crc kubenswrapper[4835]: I1002 11:10:03.734348 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:10:03 crc kubenswrapper[4835]: I1002 11:10:03.783857 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:10:03 crc kubenswrapper[4835]: I1002 11:10:03.885214 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:10:04 crc kubenswrapper[4835]: I1002 11:10:04.011039 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f25px"] Oct 02 11:10:04 crc kubenswrapper[4835]: I1002 11:10:04.857542 4835 generic.go:334] "Generic (PLEG): container finished" podID="89a37e75-2e26-4eac-bd07-061b2cb7f0a9" containerID="4507418e7448c7224c3a57ccdc0c33027fafb4e9e06493a651db6957b666ece6" exitCode=0 Oct 02 11:10:04 crc kubenswrapper[4835]: I1002 11:10:04.858619 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerDied","Data":"4507418e7448c7224c3a57ccdc0c33027fafb4e9e06493a651db6957b666ece6"} Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.880963 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"2251312127fc4e047d7ba98063b15564882b98d5a9c16ccd37906f514b2533d0"} Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.881362 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"8023f27dc592e387bdb7feb5b62569052f8c837b5d6a3000caa24eb2ee7df337"} Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.881386 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"f92f0ca5f8740ad2c944190947dc3bb8e5c95223facdd3fa9aba28017388508c"} Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.881398 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"f8ae2ca69d817e34d1b4b883f62dc1caf90dc9410cca139a3ebb8775fd2bc1ab"} Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.881412 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" 
event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"9015fa80a8611a6dbca9690f214a8f1d9bef11900e6a081df60605d14027f273"} Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.881424 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hdp2h" event={"ID":"89a37e75-2e26-4eac-bd07-061b2cb7f0a9","Type":"ContainerStarted","Data":"1e69f349735f53062782eb71a6577afc24cd12648fb79fb2ae151763899e5cd2"} Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.881164 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f25px" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="registry-server" containerID="cri-o://d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac" gracePeriod=2 Oct 02 11:10:05 crc kubenswrapper[4835]: I1002 11:10:05.930512 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hdp2h" podStartSLOduration=6.609753144 podStartE2EDuration="14.930484132s" podCreationTimestamp="2025-10-02 11:09:51 +0000 UTC" firstStartedPulling="2025-10-02 11:09:52.124339911 +0000 UTC m=+868.684247492" lastFinishedPulling="2025-10-02 11:10:00.445070909 +0000 UTC m=+877.004978480" observedRunningTime="2025-10-02 11:10:05.925985925 +0000 UTC m=+882.485893516" watchObservedRunningTime="2025-10-02 11:10:05.930484132 +0000 UTC m=+882.490391713" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.778462 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.889970 4835 generic.go:334] "Generic (PLEG): container finished" podID="44847688-613f-4198-97b5-196c36150dbb" containerID="d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac" exitCode=0 Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.890059 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f25px" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.890098 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f25px" event={"ID":"44847688-613f-4198-97b5-196c36150dbb","Type":"ContainerDied","Data":"d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac"} Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.890175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f25px" event={"ID":"44847688-613f-4198-97b5-196c36150dbb","Type":"ContainerDied","Data":"f869a13f6ca0b85d75f45587e6d40115345be812c91ab732a6cc8d5ddfa8ee43"} Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.890206 4835 scope.go:117] "RemoveContainer" containerID="d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.890941 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.909732 4835 scope.go:117] "RemoveContainer" containerID="9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.941440 4835 scope.go:117] "RemoveContainer" containerID="0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.959640 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.960772 4835 scope.go:117] "RemoveContainer" containerID="d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac" Oct 02 11:10:06 crc kubenswrapper[4835]: E1002 11:10:06.961444 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac\": container with ID starting with d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac not found: ID does not exist" containerID="d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.961497 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac"} err="failed to get container status \"d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac\": rpc error: code = NotFound desc = could not find container \"d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac\": container with ID starting with d22469a9e5d5afaceaa41e7e3312624f2a1aea97e3e908de53a1051b7b44feac not found: ID does not exist" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.961532 4835 scope.go:117] "RemoveContainer" containerID="9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd" Oct 02 11:10:06 crc kubenswrapper[4835]: E1002 11:10:06.962176 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd\": container with ID starting with 9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd not found: ID does not exist" containerID="9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.962206 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd"} err="failed to get container status \"9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd\": rpc error: code = NotFound desc = could not find container \"9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd\": container with ID starting with 9433c7f30bcc55e29894d06d7a47ac4ae81cdc87d54134bc88a661ccb8dc1bfd not found: ID does not exist" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.962309 4835 scope.go:117] "RemoveContainer" containerID="0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb" Oct 02 11:10:06 crc kubenswrapper[4835]: E1002 11:10:06.962540 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb\": container with ID starting with 0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb not found: ID does not exist" containerID="0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.962565 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb"} err="failed to get container status \"0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb\": rpc error: code = NotFound desc = could not find container \"0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb\": container with ID starting with 0d47ccb4ff882145ebad749f887bd0546bced7feb25aa5ee21101bedf4c50afb not found: ID does not exist" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.967118 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-catalog-content\") pod \"44847688-613f-4198-97b5-196c36150dbb\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.967242 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-utilities\") pod \"44847688-613f-4198-97b5-196c36150dbb\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.967277 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx5mv\" (UniqueName: \"kubernetes.io/projected/44847688-613f-4198-97b5-196c36150dbb-kube-api-access-hx5mv\") pod \"44847688-613f-4198-97b5-196c36150dbb\" (UID: \"44847688-613f-4198-97b5-196c36150dbb\") " Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.968625 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-utilities" (OuterVolumeSpecName: "utilities") pod "44847688-613f-4198-97b5-196c36150dbb" (UID: "44847688-613f-4198-97b5-196c36150dbb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:06 crc kubenswrapper[4835]: I1002 11:10:06.974976 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44847688-613f-4198-97b5-196c36150dbb-kube-api-access-hx5mv" (OuterVolumeSpecName: "kube-api-access-hx5mv") pod "44847688-613f-4198-97b5-196c36150dbb" (UID: "44847688-613f-4198-97b5-196c36150dbb"). InnerVolumeSpecName "kube-api-access-hx5mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:07 crc kubenswrapper[4835]: I1002 11:10:07.002106 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:10:07 crc kubenswrapper[4835]: I1002 11:10:07.016060 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44847688-613f-4198-97b5-196c36150dbb" (UID: "44847688-613f-4198-97b5-196c36150dbb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:07 crc kubenswrapper[4835]: I1002 11:10:07.069424 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:07 crc kubenswrapper[4835]: I1002 11:10:07.069466 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx5mv\" (UniqueName: \"kubernetes.io/projected/44847688-613f-4198-97b5-196c36150dbb-kube-api-access-hx5mv\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:07 crc kubenswrapper[4835]: I1002 11:10:07.069476 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44847688-613f-4198-97b5-196c36150dbb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:07 crc kubenswrapper[4835]: I1002 11:10:07.218163 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f25px"] Oct 02 11:10:07 crc kubenswrapper[4835]: I1002 11:10:07.221708 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f25px"] Oct 02 11:10:08 crc kubenswrapper[4835]: I1002 11:10:08.263802 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44847688-613f-4198-97b5-196c36150dbb" path="/var/lib/kubelet/pods/44847688-613f-4198-97b5-196c36150dbb/volumes" Oct 02 11:10:11 crc kubenswrapper[4835]: I1002 11:10:11.581037 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-gmhcv" Oct 02 11:10:11 crc kubenswrapper[4835]: I1002 11:10:11.959684 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-slt97" Oct 02 11:10:12 crc kubenswrapper[4835]: I1002 11:10:12.964882 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-str4z" Oct 02 11:10:15 crc kubenswrapper[4835]: I1002 11:10:15.994242 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tzvgc"] Oct 02 11:10:15 crc kubenswrapper[4835]: E1002 11:10:15.995447 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="registry-server" Oct 02 11:10:15 crc kubenswrapper[4835]: I1002 11:10:15.995467 4835 
state_mem.go:107] "Deleted CPUSet assignment" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="registry-server" Oct 02 11:10:15 crc kubenswrapper[4835]: E1002 11:10:15.995486 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="extract-utilities" Oct 02 11:10:15 crc kubenswrapper[4835]: I1002 11:10:15.995493 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="extract-utilities" Oct 02 11:10:15 crc kubenswrapper[4835]: E1002 11:10:15.995503 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="extract-content" Oct 02 11:10:15 crc kubenswrapper[4835]: I1002 11:10:15.995510 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="extract-content" Oct 02 11:10:15 crc kubenswrapper[4835]: I1002 11:10:15.995672 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="44847688-613f-4198-97b5-196c36150dbb" containerName="registry-server" Oct 02 11:10:15 crc kubenswrapper[4835]: I1002 11:10:15.996340 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tzvgc" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.002984 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-d9p5k" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.003001 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.006753 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.074596 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tzvgc"] Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.101084 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms54d\" (UniqueName: \"kubernetes.io/projected/6456b274-8ee5-48dd-80ad-3749410ee9b0-kube-api-access-ms54d\") pod \"openstack-operator-index-tzvgc\" (UID: \"6456b274-8ee5-48dd-80ad-3749410ee9b0\") " pod="openstack-operators/openstack-operator-index-tzvgc" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.202483 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms54d\" (UniqueName: \"kubernetes.io/projected/6456b274-8ee5-48dd-80ad-3749410ee9b0-kube-api-access-ms54d\") pod \"openstack-operator-index-tzvgc\" (UID: \"6456b274-8ee5-48dd-80ad-3749410ee9b0\") " pod="openstack-operators/openstack-operator-index-tzvgc" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.234705 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms54d\" (UniqueName: \"kubernetes.io/projected/6456b274-8ee5-48dd-80ad-3749410ee9b0-kube-api-access-ms54d\") pod \"openstack-operator-index-tzvgc\" (UID: \"6456b274-8ee5-48dd-80ad-3749410ee9b0\") " pod="openstack-operators/openstack-operator-index-tzvgc" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.314019 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tzvgc" Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.753788 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tzvgc"] Oct 02 11:10:16 crc kubenswrapper[4835]: W1002 11:10:16.759304 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6456b274_8ee5_48dd_80ad_3749410ee9b0.slice/crio-1dc1463840f347cbf91d5d752c9685f5959d972653f586c393d63c1f391e2c1f WatchSource:0}: Error finding container 1dc1463840f347cbf91d5d752c9685f5959d972653f586c393d63c1f391e2c1f: Status 404 returned error can't find the container with id 1dc1463840f347cbf91d5d752c9685f5959d972653f586c393d63c1f391e2c1f Oct 02 11:10:16 crc kubenswrapper[4835]: I1002 11:10:16.964064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzvgc" event={"ID":"6456b274-8ee5-48dd-80ad-3749410ee9b0","Type":"ContainerStarted","Data":"1dc1463840f347cbf91d5d752c9685f5959d972653f586c393d63c1f391e2c1f"} Oct 02 11:10:19 crc kubenswrapper[4835]: I1002 11:10:19.344523 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-tzvgc"] Oct 02 11:10:19 crc kubenswrapper[4835]: I1002 11:10:19.951782 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vf2mn"] Oct 02 11:10:19 crc kubenswrapper[4835]: I1002 11:10:19.952891 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:19 crc kubenswrapper[4835]: I1002 11:10:19.966847 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vf2mn"] Oct 02 11:10:19 crc kubenswrapper[4835]: I1002 11:10:19.992008 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzvgc" event={"ID":"6456b274-8ee5-48dd-80ad-3749410ee9b0","Type":"ContainerStarted","Data":"18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d"} Oct 02 11:10:20 crc kubenswrapper[4835]: I1002 11:10:20.016535 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tzvgc" podStartSLOduration=2.79048549 podStartE2EDuration="5.016495535s" podCreationTimestamp="2025-10-02 11:10:15 +0000 UTC" firstStartedPulling="2025-10-02 11:10:16.761674658 +0000 UTC m=+893.321582239" lastFinishedPulling="2025-10-02 11:10:18.987684673 +0000 UTC m=+895.547592284" observedRunningTime="2025-10-02 11:10:20.010348912 +0000 UTC m=+896.570256483" watchObservedRunningTime="2025-10-02 11:10:20.016495535 +0000 UTC m=+896.576403166" Oct 02 11:10:20 crc kubenswrapper[4835]: I1002 11:10:20.055327 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq8hc\" (UniqueName: \"kubernetes.io/projected/2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9-kube-api-access-xq8hc\") pod \"openstack-operator-index-vf2mn\" (UID: \"2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9\") " pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:20 crc kubenswrapper[4835]: I1002 11:10:20.157210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq8hc\" (UniqueName: \"kubernetes.io/projected/2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9-kube-api-access-xq8hc\") pod \"openstack-operator-index-vf2mn\" 
(UID: \"2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9\") " pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:20 crc kubenswrapper[4835]: I1002 11:10:20.185370 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq8hc\" (UniqueName: \"kubernetes.io/projected/2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9-kube-api-access-xq8hc\") pod \"openstack-operator-index-vf2mn\" (UID: \"2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9\") " pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:20 crc kubenswrapper[4835]: I1002 11:10:20.280458 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:20 crc kubenswrapper[4835]: I1002 11:10:20.721438 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vf2mn"] Oct 02 11:10:20 crc kubenswrapper[4835]: W1002 11:10:20.726984 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8f70e8_7a4e_4ff1_90a1_d53d0fdc3bd9.slice/crio-a0e510d3a2df979962bff7b86281ed42c210a5350c7ef9a95d28cc8bd442e6cc WatchSource:0}: Error finding container a0e510d3a2df979962bff7b86281ed42c210a5350c7ef9a95d28cc8bd442e6cc: Status 404 returned error can't find the container with id a0e510d3a2df979962bff7b86281ed42c210a5350c7ef9a95d28cc8bd442e6cc Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.000231 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vf2mn" event={"ID":"2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9","Type":"ContainerStarted","Data":"deb02e10d7dd62cf67e89b9fdb80e8d78dbfe623f92d59e9a54399a8ccc42bc8"} Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.000610 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vf2mn" event={"ID":"2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9","Type":"ContainerStarted","Data":"a0e510d3a2df979962bff7b86281ed42c210a5350c7ef9a95d28cc8bd442e6cc"} Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.000373 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-tzvgc" podUID="6456b274-8ee5-48dd-80ad-3749410ee9b0" containerName="registry-server" containerID="cri-o://18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d" gracePeriod=2 Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.017875 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vf2mn" podStartSLOduration=1.954564091 podStartE2EDuration="2.017851916s" podCreationTimestamp="2025-10-02 11:10:19 +0000 UTC" firstStartedPulling="2025-10-02 11:10:20.730818724 +0000 UTC m=+897.290726305" lastFinishedPulling="2025-10-02 11:10:20.794106559 +0000 UTC m=+897.354014130" observedRunningTime="2025-10-02 11:10:21.013888815 +0000 UTC m=+897.573796416" watchObservedRunningTime="2025-10-02 11:10:21.017851916 +0000 UTC m=+897.577759517" Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.357025 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tzvgc" Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.475888 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms54d\" (UniqueName: \"kubernetes.io/projected/6456b274-8ee5-48dd-80ad-3749410ee9b0-kube-api-access-ms54d\") pod \"6456b274-8ee5-48dd-80ad-3749410ee9b0\" (UID: \"6456b274-8ee5-48dd-80ad-3749410ee9b0\") " Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.485064 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6456b274-8ee5-48dd-80ad-3749410ee9b0-kube-api-access-ms54d" (OuterVolumeSpecName: "kube-api-access-ms54d") pod "6456b274-8ee5-48dd-80ad-3749410ee9b0" (UID: "6456b274-8ee5-48dd-80ad-3749410ee9b0"). InnerVolumeSpecName "kube-api-access-ms54d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.577394 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms54d\" (UniqueName: \"kubernetes.io/projected/6456b274-8ee5-48dd-80ad-3749410ee9b0-kube-api-access-ms54d\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:21 crc kubenswrapper[4835]: I1002 11:10:21.963292 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hdp2h" Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.008942 4835 generic.go:334] "Generic (PLEG): container finished" podID="6456b274-8ee5-48dd-80ad-3749410ee9b0" containerID="18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d" exitCode=0 Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.009017 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tzvgc" Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.009055 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzvgc" event={"ID":"6456b274-8ee5-48dd-80ad-3749410ee9b0","Type":"ContainerDied","Data":"18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d"} Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.009128 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzvgc" event={"ID":"6456b274-8ee5-48dd-80ad-3749410ee9b0","Type":"ContainerDied","Data":"1dc1463840f347cbf91d5d752c9685f5959d972653f586c393d63c1f391e2c1f"} Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.009153 4835 scope.go:117] "RemoveContainer" containerID="18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d" Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.026320 4835 scope.go:117] "RemoveContainer" containerID="18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d" Oct 02 11:10:22 crc kubenswrapper[4835]: E1002 11:10:22.027253 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d\": container with ID starting with 18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d not found: ID does not exist" containerID="18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d" Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.027319 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d"} err="failed to get 
container status \"18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d\": rpc error: code = NotFound desc = could not find container \"18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d\": container with ID starting with 18a045d74c02234542a4c930b140c048691f817a1534af44108f6a0f2d66140d not found: ID does not exist" Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.038168 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-tzvgc"] Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.041507 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-tzvgc"] Oct 02 11:10:22 crc kubenswrapper[4835]: I1002 11:10:22.264181 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6456b274-8ee5-48dd-80ad-3749410ee9b0" path="/var/lib/kubelet/pods/6456b274-8ee5-48dd-80ad-3749410ee9b0/volumes" Oct 02 11:10:29 crc kubenswrapper[4835]: I1002 11:10:29.969181 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzkg2"] Oct 02 11:10:29 crc kubenswrapper[4835]: E1002 11:10:29.970281 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6456b274-8ee5-48dd-80ad-3749410ee9b0" containerName="registry-server" Oct 02 11:10:29 crc kubenswrapper[4835]: I1002 11:10:29.970303 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6456b274-8ee5-48dd-80ad-3749410ee9b0" containerName="registry-server" Oct 02 11:10:29 crc kubenswrapper[4835]: I1002 11:10:29.970514 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6456b274-8ee5-48dd-80ad-3749410ee9b0" containerName="registry-server" Oct 02 11:10:29 crc kubenswrapper[4835]: I1002 11:10:29.972105 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:29.978946 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzkg2"] Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.112258 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frhwq\" (UniqueName: \"kubernetes.io/projected/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-kube-api-access-frhwq\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.112396 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-catalog-content\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.112460 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-utilities\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.213681 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-catalog-content\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.213762 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-utilities\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.213841 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frhwq\" (UniqueName: \"kubernetes.io/projected/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-kube-api-access-frhwq\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.214699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-utilities\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.214723 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-catalog-content\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.243403 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-frhwq\" (UniqueName: \"kubernetes.io/projected/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-kube-api-access-frhwq\") pod \"redhat-marketplace-tzkg2\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.281607 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.281709 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.319270 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.331783 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:30 crc kubenswrapper[4835]: I1002 11:10:30.566237 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzkg2"] Oct 02 11:10:31 crc kubenswrapper[4835]: I1002 11:10:31.068470 4835 generic.go:334] "Generic (PLEG): container finished" podID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerID="13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13" exitCode=0 Oct 02 11:10:31 crc kubenswrapper[4835]: I1002 11:10:31.068610 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzkg2" event={"ID":"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc","Type":"ContainerDied","Data":"13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13"} Oct 02 11:10:31 crc kubenswrapper[4835]: I1002 11:10:31.068718 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzkg2" event={"ID":"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc","Type":"ContainerStarted","Data":"80c8a73192163274ba372686653e0de6fa644370b9c0b06b18df665cfdeebc9c"} Oct 02 11:10:31 crc kubenswrapper[4835]: I1002 11:10:31.107360 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-vf2mn" Oct 02 11:10:33 crc kubenswrapper[4835]: I1002 11:10:33.084905 4835 generic.go:334] "Generic (PLEG): container finished" podID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerID="7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2" exitCode=0 Oct 02 11:10:33 crc kubenswrapper[4835]: I1002 11:10:33.085017 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzkg2" event={"ID":"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc","Type":"ContainerDied","Data":"7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2"} Oct 02 11:10:34 crc kubenswrapper[4835]: I1002 11:10:34.097705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzkg2" event={"ID":"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc","Type":"ContainerStarted","Data":"6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d"} Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.804839 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzkg2" podStartSLOduration=7.240578999 podStartE2EDuration="9.804814132s" podCreationTimestamp="2025-10-02 11:10:29 +0000 UTC" 
firstStartedPulling="2025-10-02 11:10:31.070518421 +0000 UTC m=+907.630426002" lastFinishedPulling="2025-10-02 11:10:33.634753554 +0000 UTC m=+910.194661135" observedRunningTime="2025-10-02 11:10:34.129422161 +0000 UTC m=+910.689329752" watchObservedRunningTime="2025-10-02 11:10:38.804814132 +0000 UTC m=+915.364721713" Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.810667 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp"] Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.812440 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.815248 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2f7rk" Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.829179 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp"] Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.950182 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-bundle\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.950288 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-util\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:38 crc kubenswrapper[4835]: I1002 11:10:38.950329 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v59d6\" (UniqueName: \"kubernetes.io/projected/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-kube-api-access-v59d6\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.051922 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-bundle\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.052013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-util\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.052068 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v59d6\" (UniqueName: \"kubernetes.io/projected/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-kube-api-access-v59d6\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.052674 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-bundle\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.052705 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-util\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.090788 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v59d6\" (UniqueName: \"kubernetes.io/projected/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-kube-api-access-v59d6\") pod \"6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.137348 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:39 crc kubenswrapper[4835]: I1002 11:10:39.378679 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp"] Oct 02 11:10:40 crc kubenswrapper[4835]: I1002 11:10:40.139041 4835 generic.go:334] "Generic (PLEG): container finished" podID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerID="c3480d3b2031398118dd05ffbe4d429d226e4d1f27a0016b306cbd63bdf8e80a" exitCode=0 Oct 02 11:10:40 crc kubenswrapper[4835]: I1002 11:10:40.139155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" event={"ID":"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef","Type":"ContainerDied","Data":"c3480d3b2031398118dd05ffbe4d429d226e4d1f27a0016b306cbd63bdf8e80a"} Oct 02 11:10:40 crc kubenswrapper[4835]: I1002 11:10:40.139481 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" event={"ID":"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef","Type":"ContainerStarted","Data":"a94caac94f473d8b861be8fb7b5964a311fe7a5e94f91a7d3a78f98b77e02a24"} Oct 02 11:10:40 crc kubenswrapper[4835]: I1002 11:10:40.332248 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:40 crc kubenswrapper[4835]: I1002 11:10:40.332622 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:40 crc kubenswrapper[4835]: I1002 11:10:40.382457 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:41 crc kubenswrapper[4835]: I1002 11:10:41.147909 4835 generic.go:334] "Generic (PLEG): container finished" podID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerID="e8459242d0e8201d71c94de87ca6467d31a427d940dab3fcbc3f4c9b7da36e55" exitCode=0 Oct 02 11:10:41 crc kubenswrapper[4835]: I1002 11:10:41.147980 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" event={"ID":"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef","Type":"ContainerDied","Data":"e8459242d0e8201d71c94de87ca6467d31a427d940dab3fcbc3f4c9b7da36e55"} Oct 02 11:10:41 crc kubenswrapper[4835]: I1002 11:10:41.202362 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:42 crc kubenswrapper[4835]: I1002 11:10:42.159954 4835 generic.go:334] "Generic (PLEG): container finished" podID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerID="748c95561c0cb3021f9bd0f2ffaec2a85551f8d7bbfa7800620b138b183f7656" exitCode=0 Oct 02 11:10:42 crc kubenswrapper[4835]: I1002 11:10:42.160678 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" event={"ID":"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef","Type":"ContainerDied","Data":"748c95561c0cb3021f9bd0f2ffaec2a85551f8d7bbfa7800620b138b183f7656"} Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.144148 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzkg2"] Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.455705 4835 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.630277 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-bundle\") pod \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.630334 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v59d6\" (UniqueName: \"kubernetes.io/projected/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-kube-api-access-v59d6\") pod \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.630443 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-util\") pod \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\" (UID: \"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef\") " Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.631456 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-bundle" (OuterVolumeSpecName: "bundle") pod "f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" (UID: "f93ea9d4-aae6-4d12-aef2-7ecf558b4fef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.636952 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-kube-api-access-v59d6" (OuterVolumeSpecName: "kube-api-access-v59d6") pod "f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" (UID: "f93ea9d4-aae6-4d12-aef2-7ecf558b4fef"). InnerVolumeSpecName "kube-api-access-v59d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.648032 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-util" (OuterVolumeSpecName: "util") pod "f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" (UID: "f93ea9d4-aae6-4d12-aef2-7ecf558b4fef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.732515 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.732555 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:43 crc kubenswrapper[4835]: I1002 11:10:43.732569 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v59d6\" (UniqueName: \"kubernetes.io/projected/f93ea9d4-aae6-4d12-aef2-7ecf558b4fef-kube-api-access-v59d6\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.179539 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" event={"ID":"f93ea9d4-aae6-4d12-aef2-7ecf558b4fef","Type":"ContainerDied","Data":"a94caac94f473d8b861be8fb7b5964a311fe7a5e94f91a7d3a78f98b77e02a24"} Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.179607 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a94caac94f473d8b861be8fb7b5964a311fe7a5e94f91a7d3a78f98b77e02a24" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.179572 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.179710 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tzkg2" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="registry-server" containerID="cri-o://6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d" gracePeriod=2 Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.581636 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.746661 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frhwq\" (UniqueName: \"kubernetes.io/projected/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-kube-api-access-frhwq\") pod \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.746870 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-catalog-content\") pod \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.747357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-utilities\") pod \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\" (UID: \"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc\") " Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.749088 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-utilities" (OuterVolumeSpecName: "utilities") pod "b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" (UID: "b66eb0f4-dad0-4977-94d9-1a35c3cb99fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.752117 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-kube-api-access-frhwq" (OuterVolumeSpecName: "kube-api-access-frhwq") pod "b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" (UID: "b66eb0f4-dad0-4977-94d9-1a35c3cb99fc"). InnerVolumeSpecName "kube-api-access-frhwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.769558 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" (UID: "b66eb0f4-dad0-4977-94d9-1a35c3cb99fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.849423 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.849501 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frhwq\" (UniqueName: \"kubernetes.io/projected/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-kube-api-access-frhwq\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:44 crc kubenswrapper[4835]: I1002 11:10:44.849528 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.188769 4835 generic.go:334] "Generic (PLEG): container finished" podID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerID="6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d" exitCode=0 Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.188826 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzkg2" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.188821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzkg2" event={"ID":"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc","Type":"ContainerDied","Data":"6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d"} Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.188948 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzkg2" event={"ID":"b66eb0f4-dad0-4977-94d9-1a35c3cb99fc","Type":"ContainerDied","Data":"80c8a73192163274ba372686653e0de6fa644370b9c0b06b18df665cfdeebc9c"} Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.188990 4835 scope.go:117] "RemoveContainer" containerID="6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.210690 4835 scope.go:117] "RemoveContainer" containerID="7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.218916 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzkg2"] Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.222967 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzkg2"] Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.251833 4835 scope.go:117] "RemoveContainer" containerID="13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.273468 4835 scope.go:117] "RemoveContainer" containerID="6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d" Oct 02 11:10:45 crc kubenswrapper[4835]: E1002 11:10:45.274004 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d\": container with ID starting with 6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d not found: ID does not exist" containerID="6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.274065 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d"} err="failed to get container status \"6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d\": rpc error: code = NotFound desc = could not find container \"6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d\": container with ID starting with 6906831c3556779ca781452ecc0a25ee43f8668d0b8244c971d2ac46b43cec7d not found: ID does not exist" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.274101 4835 scope.go:117] "RemoveContainer" containerID="7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2" Oct 02 11:10:45 crc kubenswrapper[4835]: E1002 11:10:45.274691 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2\": container with ID starting with 7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2 not found: ID does not exist" containerID="7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.274723 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2"} err="failed to get container status \"7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2\": rpc error: code = NotFound desc = could not find container \"7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2\": container with ID starting with 7f405f51b848ac038dbc675fb383e93d24c555563fc6cb1abc4c2853b9aff4b2 not found: ID does not exist" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.274762 4835 scope.go:117] "RemoveContainer" containerID="13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13" Oct 02 11:10:45 crc kubenswrapper[4835]: E1002 11:10:45.275414 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13\": container with ID starting with 13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13 not found: ID does not exist" containerID="13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13" Oct 02 11:10:45 crc kubenswrapper[4835]: I1002 11:10:45.275471 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13"} err="failed to get container status \"13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13\": rpc error: code = NotFound desc = could not find container \"13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13\": container with ID starting with 13113d1301ad43c9248076235d193d110f9afc09b590af1eafe53f38979cdf13 not found: ID does not exist" Oct 02 11:10:46 crc kubenswrapper[4835]: I1002 11:10:46.258486 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" path="/var/lib/kubelet/pods/b66eb0f4-dad0-4977-94d9-1a35c3cb99fc/volumes" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.569946 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7"] Oct 02 11:10:49 crc kubenswrapper[4835]: E1002 11:10:49.570583 4835 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="registry-server" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570598 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="registry-server" Oct 02 11:10:49 crc kubenswrapper[4835]: E1002 11:10:49.570616 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerName="extract" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570624 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerName="extract" Oct 02 11:10:49 crc kubenswrapper[4835]: E1002 11:10:49.570640 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="extract-utilities" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570648 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="extract-utilities" Oct 02 11:10:49 crc kubenswrapper[4835]: E1002 11:10:49.570660 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerName="util" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570667 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerName="util" Oct 02 11:10:49 crc kubenswrapper[4835]: E1002 11:10:49.570680 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="extract-content" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570689 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="extract-content" Oct 02 11:10:49 crc kubenswrapper[4835]: E1002 11:10:49.570719 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerName="pull" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570729 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerName="pull" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570872 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66eb0f4-dad0-4977-94d9-1a35c3cb99fc" containerName="registry-server" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.570888 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93ea9d4-aae6-4d12-aef2-7ecf558b4fef" containerName="extract" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.571616 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.581921 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-btmk8" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.610722 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7"] Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.758491 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9mms\" (UniqueName: \"kubernetes.io/projected/937e16a3-fce3-4b5f-8969-69995e59a465-kube-api-access-n9mms\") pod \"openstack-operator-controller-operator-db7694d5f-lbfr7\" (UID: \"937e16a3-fce3-4b5f-8969-69995e59a465\") " pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.859645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9mms\" (UniqueName: \"kubernetes.io/projected/937e16a3-fce3-4b5f-8969-69995e59a465-kube-api-access-n9mms\") pod \"openstack-operator-controller-operator-db7694d5f-lbfr7\" (UID: \"937e16a3-fce3-4b5f-8969-69995e59a465\") " pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.879075 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9mms\" (UniqueName: \"kubernetes.io/projected/937e16a3-fce3-4b5f-8969-69995e59a465-kube-api-access-n9mms\") pod \"openstack-operator-controller-operator-db7694d5f-lbfr7\" (UID: \"937e16a3-fce3-4b5f-8969-69995e59a465\") " pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" Oct 02 11:10:49 crc kubenswrapper[4835]: I1002 11:10:49.903529 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" Oct 02 11:10:50 crc kubenswrapper[4835]: I1002 11:10:50.326763 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7"] Oct 02 11:10:50 crc kubenswrapper[4835]: W1002 11:10:50.339269 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod937e16a3_fce3_4b5f_8969_69995e59a465.slice/crio-a56794e22b02f45f1f00c5092c840fc21751df8ab4fe988ef448a84d4cf90840 WatchSource:0}: Error finding container a56794e22b02f45f1f00c5092c840fc21751df8ab4fe988ef448a84d4cf90840: Status 404 returned error can't find the container with id a56794e22b02f45f1f00c5092c840fc21751df8ab4fe988ef448a84d4cf90840 Oct 02 11:10:51 crc kubenswrapper[4835]: I1002 11:10:51.241187 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" event={"ID":"937e16a3-fce3-4b5f-8969-69995e59a465","Type":"ContainerStarted","Data":"a56794e22b02f45f1f00c5092c840fc21751df8ab4fe988ef448a84d4cf90840"} Oct 02 11:10:53 crc kubenswrapper[4835]: I1002 11:10:53.784884 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bzpph"] Oct 02 11:10:53 crc kubenswrapper[4835]: I1002 11:10:53.786737 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:53 crc kubenswrapper[4835]: I1002 11:10:53.796921 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzpph"] Oct 02 11:10:53 crc kubenswrapper[4835]: I1002 11:10:53.924947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q528d\" (UniqueName: \"kubernetes.io/projected/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-kube-api-access-q528d\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:53 crc kubenswrapper[4835]: I1002 11:10:53.925442 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-utilities\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:53 crc kubenswrapper[4835]: I1002 11:10:53.925491 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-catalog-content\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.026733 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q528d\" (UniqueName: \"kubernetes.io/projected/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-kube-api-access-q528d\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.026834 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-utilities\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.026875 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-catalog-content\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.027378 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-catalog-content\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.027427 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-utilities\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.048028 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q528d\" (UniqueName: \"kubernetes.io/projected/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-kube-api-access-q528d\") pod \"certified-operators-bzpph\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.106925 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:10:54 crc kubenswrapper[4835]: I1002 11:10:54.572959 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzpph"] Oct 02 11:10:54 crc kubenswrapper[4835]: W1002 11:10:54.574341 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e4a4de_f2fc_4267_b8b6_3b1338a13a66.slice/crio-5c3b17ebb189ff49275bbf47d1bc6187d60fa210bce135bf778760a9419e0554 WatchSource:0}: Error finding container 5c3b17ebb189ff49275bbf47d1bc6187d60fa210bce135bf778760a9419e0554: Status 404 returned error can't find the container with id 5c3b17ebb189ff49275bbf47d1bc6187d60fa210bce135bf778760a9419e0554 Oct 02 11:10:55 crc kubenswrapper[4835]: I1002 11:10:55.309304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" event={"ID":"937e16a3-fce3-4b5f-8969-69995e59a465","Type":"ContainerStarted","Data":"73026b3cb0f9b155de8854af1cf06615f3d7ff1e6b8eef34ef76a805d817102e"} Oct 02 11:10:55 crc kubenswrapper[4835]: I1002 11:10:55.311327 4835 generic.go:334] "Generic (PLEG): container finished" podID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerID="50b14daca4c400bcad721d1233c9c089522f26371ee8234c13eb082655af5050" exitCode=0 Oct 02 11:10:55 crc kubenswrapper[4835]: I1002 11:10:55.311353 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzpph" event={"ID":"69e4a4de-f2fc-4267-b8b6-3b1338a13a66","Type":"ContainerDied","Data":"50b14daca4c400bcad721d1233c9c089522f26371ee8234c13eb082655af5050"} Oct 02 11:10:55 crc kubenswrapper[4835]: I1002 11:10:55.311367 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzpph" event={"ID":"69e4a4de-f2fc-4267-b8b6-3b1338a13a66","Type":"ContainerStarted","Data":"5c3b17ebb189ff49275bbf47d1bc6187d60fa210bce135bf778760a9419e0554"} Oct 02 11:10:57 crc kubenswrapper[4835]: I1002 11:10:57.340122 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" event={"ID":"937e16a3-fce3-4b5f-8969-69995e59a465","Type":"ContainerStarted","Data":"9517095698b75f9f44a09d1e4ef7f795f3b40c9f32d70d55f17459c4798fca52"} Oct 02 11:10:57 crc kubenswrapper[4835]: I1002 11:10:57.340730 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" Oct 02 11:10:57 crc kubenswrapper[4835]: I1002 11:10:57.341908 4835 generic.go:334] "Generic (PLEG): container finished" podID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerID="d5cada5c4eed2f2aba62adb40246102a2dd5c5f47c0df46089a95895c0d83d87" exitCode=0 Oct 02 11:10:57 crc kubenswrapper[4835]: I1002 11:10:57.341963 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzpph" 
event={"ID":"69e4a4de-f2fc-4267-b8b6-3b1338a13a66","Type":"ContainerDied","Data":"d5cada5c4eed2f2aba62adb40246102a2dd5c5f47c0df46089a95895c0d83d87"} Oct 02 11:10:57 crc kubenswrapper[4835]: I1002 11:10:57.374537 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" podStartSLOduration=2.475746557 podStartE2EDuration="8.374513354s" podCreationTimestamp="2025-10-02 11:10:49 +0000 UTC" firstStartedPulling="2025-10-02 11:10:50.341590041 +0000 UTC m=+926.901497622" lastFinishedPulling="2025-10-02 11:10:56.240356838 +0000 UTC m=+932.800264419" observedRunningTime="2025-10-02 11:10:57.367194389 +0000 UTC m=+933.927101980" watchObservedRunningTime="2025-10-02 11:10:57.374513354 +0000 UTC m=+933.934420935" Oct 02 11:10:58 crc kubenswrapper[4835]: I1002 11:10:58.350902 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzpph" event={"ID":"69e4a4de-f2fc-4267-b8b6-3b1338a13a66","Type":"ContainerStarted","Data":"1d8bc18251d58d749efa1a26573b3a8553873e81e38b72297989378f8e19e257"} Oct 02 11:10:58 crc kubenswrapper[4835]: I1002 11:10:58.384650 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bzpph" podStartSLOduration=2.964392775 podStartE2EDuration="5.384626759s" podCreationTimestamp="2025-10-02 11:10:53 +0000 UTC" firstStartedPulling="2025-10-02 11:10:55.425382625 +0000 UTC m=+931.985290206" lastFinishedPulling="2025-10-02 11:10:57.845616609 +0000 UTC m=+934.405524190" observedRunningTime="2025-10-02 11:10:58.379701941 +0000 UTC m=+934.939609532" watchObservedRunningTime="2025-10-02 11:10:58.384626759 +0000 UTC m=+934.944534340" Oct 02 11:10:59 crc kubenswrapper[4835]: I1002 11:10:59.906685 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-db7694d5f-lbfr7" Oct 02 11:11:04 crc kubenswrapper[4835]: I1002 11:11:04.107757 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:11:04 crc kubenswrapper[4835]: I1002 11:11:04.108514 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:11:04 crc kubenswrapper[4835]: I1002 11:11:04.151610 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:11:04 crc kubenswrapper[4835]: I1002 11:11:04.428099 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:11:06 crc kubenswrapper[4835]: I1002 11:11:06.950710 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bzpph"] Oct 02 11:11:06 crc kubenswrapper[4835]: I1002 11:11:06.951540 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bzpph" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="registry-server" containerID="cri-o://1d8bc18251d58d749efa1a26573b3a8553873e81e38b72297989378f8e19e257" gracePeriod=2 Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.409327 4835 generic.go:334] "Generic (PLEG): container finished" podID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerID="1d8bc18251d58d749efa1a26573b3a8553873e81e38b72297989378f8e19e257" exitCode=0 Oct 02 
11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.409662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzpph" event={"ID":"69e4a4de-f2fc-4267-b8b6-3b1338a13a66","Type":"ContainerDied","Data":"1d8bc18251d58d749efa1a26573b3a8553873e81e38b72297989378f8e19e257"} Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.476789 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.643862 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q528d\" (UniqueName: \"kubernetes.io/projected/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-kube-api-access-q528d\") pod \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.643951 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-utilities\") pod \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.644084 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-catalog-content\") pod \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\" (UID: \"69e4a4de-f2fc-4267-b8b6-3b1338a13a66\") " Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.644908 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-utilities" (OuterVolumeSpecName: "utilities") pod "69e4a4de-f2fc-4267-b8b6-3b1338a13a66" (UID: "69e4a4de-f2fc-4267-b8b6-3b1338a13a66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.651467 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-kube-api-access-q528d" (OuterVolumeSpecName: "kube-api-access-q528d") pod "69e4a4de-f2fc-4267-b8b6-3b1338a13a66" (UID: "69e4a4de-f2fc-4267-b8b6-3b1338a13a66"). InnerVolumeSpecName "kube-api-access-q528d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.697462 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69e4a4de-f2fc-4267-b8b6-3b1338a13a66" (UID: "69e4a4de-f2fc-4267-b8b6-3b1338a13a66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.746051 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.746104 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q528d\" (UniqueName: \"kubernetes.io/projected/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-kube-api-access-q528d\") on node \"crc\" DevicePath \"\"" Oct 02 11:11:07 crc kubenswrapper[4835]: I1002 11:11:07.746123 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e4a4de-f2fc-4267-b8b6-3b1338a13a66-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:11:08 crc kubenswrapper[4835]: I1002 11:11:08.418617 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzpph" event={"ID":"69e4a4de-f2fc-4267-b8b6-3b1338a13a66","Type":"ContainerDied","Data":"5c3b17ebb189ff49275bbf47d1bc6187d60fa210bce135bf778760a9419e0554"} Oct 02 11:11:08 crc kubenswrapper[4835]: I1002 11:11:08.419008 4835 scope.go:117] "RemoveContainer" containerID="1d8bc18251d58d749efa1a26573b3a8553873e81e38b72297989378f8e19e257" Oct 02 11:11:08 crc kubenswrapper[4835]: I1002 11:11:08.419138 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzpph" Oct 02 11:11:08 crc kubenswrapper[4835]: I1002 11:11:08.439663 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bzpph"] Oct 02 11:11:08 crc kubenswrapper[4835]: I1002 11:11:08.446310 4835 scope.go:117] "RemoveContainer" containerID="d5cada5c4eed2f2aba62adb40246102a2dd5c5f47c0df46089a95895c0d83d87" Oct 02 11:11:08 crc kubenswrapper[4835]: I1002 11:11:08.449087 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bzpph"] Oct 02 11:11:08 crc kubenswrapper[4835]: I1002 11:11:08.469250 4835 scope.go:117] "RemoveContainer" containerID="50b14daca4c400bcad721d1233c9c089522f26371ee8234c13eb082655af5050" Oct 02 11:11:10 crc kubenswrapper[4835]: I1002 11:11:10.260710 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" path="/var/lib/kubelet/pods/69e4a4de-f2fc-4267-b8b6-3b1338a13a66/volumes" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.052472 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn"] Oct 02 11:11:17 crc kubenswrapper[4835]: E1002 11:11:17.053405 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="extract-utilities" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.053424 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="extract-utilities" Oct 02 11:11:17 crc kubenswrapper[4835]: E1002 11:11:17.053443 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="extract-content" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.053449 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="extract-content" Oct 02 11:11:17 crc kubenswrapper[4835]: 
E1002 11:11:17.053461 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="registry-server" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.053469 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="registry-server" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.053596 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e4a4de-f2fc-4267-b8b6-3b1338a13a66" containerName="registry-server" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.054392 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.056848 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zx8sv" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.067558 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.083177 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.092129 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-97c5f" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.101205 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.102666 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bkjk\" (UniqueName: \"kubernetes.io/projected/0805fa88-ea1a-4dec-b686-1024df504971-kube-api-access-6bkjk\") pod \"barbican-operator-controller-manager-6ff8b75857-n7dnn\" (UID: \"0805fa88-ea1a-4dec-b686-1024df504971\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.144318 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.155524 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.156940 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.160583 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-27rjb" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.161138 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.162494 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.167134 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-kblp9" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.172541 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.185284 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.193317 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.194854 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.200645 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.201387 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mrfdh" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.201735 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.205726 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4mbm2" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.206271 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bkjk\" (UniqueName: \"kubernetes.io/projected/0805fa88-ea1a-4dec-b686-1024df504971-kube-api-access-6bkjk\") pod \"barbican-operator-controller-manager-6ff8b75857-n7dnn\" (UID: \"0805fa88-ea1a-4dec-b686-1024df504971\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.206398 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxzb\" (UniqueName: \"kubernetes.io/projected/e0c4310c-242a-4a50-b5b3-6b1705d8ce4d-kube-api-access-chxzb\") pod \"cinder-operator-controller-manager-6f6c6946b9-gqb82\" (UID: \"e0c4310c-242a-4a50-b5b3-6b1705d8ce4d\") " pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.211041 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.212256 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.216847 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lpqt9" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.218080 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.222776 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.237478 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.263502 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.269753 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.271204 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.273679 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2j5jj" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.275288 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bkjk\" (UniqueName: \"kubernetes.io/projected/0805fa88-ea1a-4dec-b686-1024df504971-kube-api-access-6bkjk\") pod \"barbican-operator-controller-manager-6ff8b75857-n7dnn\" (UID: \"0805fa88-ea1a-4dec-b686-1024df504971\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.289295 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.290384 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.294140 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f6xz2" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.300650 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.307596 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.308596 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.308854 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxzb\" (UniqueName: \"kubernetes.io/projected/e0c4310c-242a-4a50-b5b3-6b1705d8ce4d-kube-api-access-chxzb\") pod \"cinder-operator-controller-manager-6f6c6946b9-gqb82\" (UID: \"e0c4310c-242a-4a50-b5b3-6b1705d8ce4d\") " pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.308936 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2scqp\" (UniqueName: \"kubernetes.io/projected/d574af81-6939-4076-8194-049c15ffb305-kube-api-access-2scqp\") pod \"horizon-operator-controller-manager-9f4696d94-6ggrv\" (UID: \"d574af81-6939-4076-8194-049c15ffb305\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.308967 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q262h\" (UniqueName: \"kubernetes.io/projected/bb6fcd5f-03e8-4d83-bfeb-6e91b2852548-kube-api-access-q262h\") pod \"designate-operator-controller-manager-84f4f7b77b-wfqz7\" (UID: \"bb6fcd5f-03e8-4d83-bfeb-6e91b2852548\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.308997 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8zkf\" (UniqueName: \"kubernetes.io/projected/f600efc9-bc85-4462-901d-10cb6ec3113c-kube-api-access-k8zkf\") pod \"heat-operator-controller-manager-5d889d78cf-lgp76\" (UID: \"f600efc9-bc85-4462-901d-10cb6ec3113c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.309704 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6b5\" (UniqueName: \"kubernetes.io/projected/3c8953f0-3559-496e-a893-76a065eea629-kube-api-access-qb6b5\") pod \"glance-operator-controller-manager-84958c4d49-bwkr6\" (UID: \"3c8953f0-3559-496e-a893-76a065eea629\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.317631 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6mvdx" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.352110 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.368996 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.375701 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.381006 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxzb\" (UniqueName: \"kubernetes.io/projected/e0c4310c-242a-4a50-b5b3-6b1705d8ce4d-kube-api-access-chxzb\") pod \"cinder-operator-controller-manager-6f6c6946b9-gqb82\" (UID: \"e0c4310c-242a-4a50-b5b3-6b1705d8ce4d\") " pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.396662 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.397999 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.401776 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-c4pmz" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.410842 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412163 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z87rt\" (UniqueName: \"kubernetes.io/projected/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-kube-api-access-z87rt\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412216 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2scqp\" (UniqueName: \"kubernetes.io/projected/d574af81-6939-4076-8194-049c15ffb305-kube-api-access-2scqp\") pod \"horizon-operator-controller-manager-9f4696d94-6ggrv\" (UID: \"d574af81-6939-4076-8194-049c15ffb305\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412270 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q262h\" (UniqueName: \"kubernetes.io/projected/bb6fcd5f-03e8-4d83-bfeb-6e91b2852548-kube-api-access-q262h\") pod \"designate-operator-controller-manager-84f4f7b77b-wfqz7\" (UID: \"bb6fcd5f-03e8-4d83-bfeb-6e91b2852548\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412304 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvsgd\" (UniqueName: \"kubernetes.io/projected/53591aeb-91e2-4d05-b596-6b9d5b7dcd3f-kube-api-access-hvsgd\") pod \"keystone-operator-controller-manager-5bd55b4bff-z4qtt\" (UID: \"53591aeb-91e2-4d05-b596-6b9d5b7dcd3f\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8zkf\" (UniqueName: \"kubernetes.io/projected/f600efc9-bc85-4462-901d-10cb6ec3113c-kube-api-access-k8zkf\") pod 
\"heat-operator-controller-manager-5d889d78cf-lgp76\" (UID: \"f600efc9-bc85-4462-901d-10cb6ec3113c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb6b5\" (UniqueName: \"kubernetes.io/projected/3c8953f0-3559-496e-a893-76a065eea629-kube-api-access-qb6b5\") pod \"glance-operator-controller-manager-84958c4d49-bwkr6\" (UID: \"3c8953f0-3559-496e-a893-76a065eea629\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412433 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6p5\" (UniqueName: \"kubernetes.io/projected/3267ccbe-611d-45a1-86fd-b901c6b52373-kube-api-access-6m6p5\") pod \"ironic-operator-controller-manager-5cd4858477-fqn5z\" (UID: \"3267ccbe-611d-45a1-86fd-b901c6b52373\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412459 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vffcp\" (UniqueName: \"kubernetes.io/projected/b8bda3e4-db9e-4d2c-a352-71f1cde3536b-kube-api-access-vffcp\") pod \"mariadb-operator-controller-manager-88c7-sqbrr\" (UID: \"b8bda3e4-db9e-4d2c-a352-71f1cde3536b\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.412490 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw9dn\" (UniqueName: \"kubernetes.io/projected/574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e-kube-api-access-dw9dn\") pod \"manila-operator-controller-manager-6d68dbc695-s85jn\" (UID: \"574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.431159 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.444377 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.445400 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.452890 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2scqp\" (UniqueName: \"kubernetes.io/projected/d574af81-6939-4076-8194-049c15ffb305-kube-api-access-2scqp\") pod \"horizon-operator-controller-manager-9f4696d94-6ggrv\" (UID: \"d574af81-6939-4076-8194-049c15ffb305\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.453435 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vc9rs" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.456356 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.457868 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.460770 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb6b5\" (UniqueName: \"kubernetes.io/projected/3c8953f0-3559-496e-a893-76a065eea629-kube-api-access-qb6b5\") pod \"glance-operator-controller-manager-84958c4d49-bwkr6\" (UID: \"3c8953f0-3559-496e-a893-76a065eea629\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.460889 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.462385 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-78dxg" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.462493 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8zkf\" (UniqueName: \"kubernetes.io/projected/f600efc9-bc85-4462-901d-10cb6ec3113c-kube-api-access-k8zkf\") pod \"heat-operator-controller-manager-5d889d78cf-lgp76\" (UID: \"f600efc9-bc85-4462-901d-10cb6ec3113c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.473451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q262h\" (UniqueName: \"kubernetes.io/projected/bb6fcd5f-03e8-4d83-bfeb-6e91b2852548-kube-api-access-q262h\") pod \"designate-operator-controller-manager-84f4f7b77b-wfqz7\" (UID: \"bb6fcd5f-03e8-4d83-bfeb-6e91b2852548\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.484136 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.490818 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.492235 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.498693 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.504786 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tw9bq" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.513993 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvsgd\" (UniqueName: \"kubernetes.io/projected/53591aeb-91e2-4d05-b596-6b9d5b7dcd3f-kube-api-access-hvsgd\") pod \"keystone-operator-controller-manager-5bd55b4bff-z4qtt\" (UID: \"53591aeb-91e2-4d05-b596-6b9d5b7dcd3f\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514075 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514113 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6p5\" (UniqueName: \"kubernetes.io/projected/3267ccbe-611d-45a1-86fd-b901c6b52373-kube-api-access-6m6p5\") pod \"ironic-operator-controller-manager-5cd4858477-fqn5z\" (UID: \"3267ccbe-611d-45a1-86fd-b901c6b52373\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514140 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vffcp\" (UniqueName: \"kubernetes.io/projected/b8bda3e4-db9e-4d2c-a352-71f1cde3536b-kube-api-access-vffcp\") pod \"mariadb-operator-controller-manager-88c7-sqbrr\" (UID: \"b8bda3e4-db9e-4d2c-a352-71f1cde3536b\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw9dn\" (UniqueName: \"kubernetes.io/projected/574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e-kube-api-access-dw9dn\") pod \"manila-operator-controller-manager-6d68dbc695-s85jn\" (UID: \"574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514202 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qln8q\" (UniqueName: \"kubernetes.io/projected/8b8872b0-5d5a-4934-b298-33b61782bd55-kube-api-access-qln8q\") pod \"octavia-operator-controller-manager-7b787867f4-zc9tx\" (UID: \"8b8872b0-5d5a-4934-b298-33b61782bd55\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9czqz\" (UniqueName: \"kubernetes.io/projected/d6b4f40c-4be2-445b-ab93-583917fb3d1a-kube-api-access-9czqz\") pod 
\"neutron-operator-controller-manager-849d5b9b84-srxgs\" (UID: \"d6b4f40c-4be2-445b-ab93-583917fb3d1a\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfm9r\" (UniqueName: \"kubernetes.io/projected/365587d7-12ca-4826-90c7-b56fac3ac05b-kube-api-access-pfm9r\") pod \"nova-operator-controller-manager-64cd67b5cb-xpmxm\" (UID: \"365587d7-12ca-4826-90c7-b56fac3ac05b\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.514328 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z87rt\" (UniqueName: \"kubernetes.io/projected/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-kube-api-access-z87rt\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:17 crc kubenswrapper[4835]: E1002 11:11:17.514988 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:17 crc kubenswrapper[4835]: E1002 11:11:17.515047 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert podName:971d8eb9-c70a-45b2-a7c3-20e6b62bbd48 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:18.015025811 +0000 UTC m=+954.574933392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert") pod "infra-operator-controller-manager-9d6c5db85-2bldp" (UID: "971d8eb9-c70a-45b2-a7c3-20e6b62bbd48") : secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.521330 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.525594 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.563280 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.564094 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.576490 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6p5\" (UniqueName: \"kubernetes.io/projected/3267ccbe-611d-45a1-86fd-b901c6b52373-kube-api-access-6m6p5\") pod \"ironic-operator-controller-manager-5cd4858477-fqn5z\" (UID: \"3267ccbe-611d-45a1-86fd-b901c6b52373\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.576594 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvsgd\" (UniqueName: \"kubernetes.io/projected/53591aeb-91e2-4d05-b596-6b9d5b7dcd3f-kube-api-access-hvsgd\") pod \"keystone-operator-controller-manager-5bd55b4bff-z4qtt\" (UID: \"53591aeb-91e2-4d05-b596-6b9d5b7dcd3f\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.583192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z87rt\" (UniqueName: \"kubernetes.io/projected/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-kube-api-access-z87rt\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.586053 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vffcp\" (UniqueName: \"kubernetes.io/projected/b8bda3e4-db9e-4d2c-a352-71f1cde3536b-kube-api-access-vffcp\") pod \"mariadb-operator-controller-manager-88c7-sqbrr\" (UID: \"b8bda3e4-db9e-4d2c-a352-71f1cde3536b\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.587867 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw9dn\" (UniqueName: \"kubernetes.io/projected/574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e-kube-api-access-dw9dn\") pod \"manila-operator-controller-manager-6d68dbc695-s85jn\" (UID: \"574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.615885 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qln8q\" (UniqueName: \"kubernetes.io/projected/8b8872b0-5d5a-4934-b298-33b61782bd55-kube-api-access-qln8q\") pod \"octavia-operator-controller-manager-7b787867f4-zc9tx\" (UID: \"8b8872b0-5d5a-4934-b298-33b61782bd55\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.615942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9czqz\" (UniqueName: \"kubernetes.io/projected/d6b4f40c-4be2-445b-ab93-583917fb3d1a-kube-api-access-9czqz\") pod \"neutron-operator-controller-manager-849d5b9b84-srxgs\" (UID: \"d6b4f40c-4be2-445b-ab93-583917fb3d1a\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.615974 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfm9r\" (UniqueName: \"kubernetes.io/projected/365587d7-12ca-4826-90c7-b56fac3ac05b-kube-api-access-pfm9r\") pod \"nova-operator-controller-manager-64cd67b5cb-xpmxm\" (UID: \"365587d7-12ca-4826-90c7-b56fac3ac05b\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.619527 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.620408 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.636578 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.643120 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.657999 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.661641 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.662106 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6q8t2" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.662617 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9czqz\" (UniqueName: \"kubernetes.io/projected/d6b4f40c-4be2-445b-ab93-583917fb3d1a-kube-api-access-9czqz\") pod \"neutron-operator-controller-manager-849d5b9b84-srxgs\" (UID: \"d6b4f40c-4be2-445b-ab93-583917fb3d1a\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.665092 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.672839 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.673078 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fzm5g" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.692600 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfm9r\" (UniqueName: \"kubernetes.io/projected/365587d7-12ca-4826-90c7-b56fac3ac05b-kube-api-access-pfm9r\") pod \"nova-operator-controller-manager-64cd67b5cb-xpmxm\" (UID: \"365587d7-12ca-4826-90c7-b56fac3ac05b\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.696172 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qln8q\" (UniqueName: \"kubernetes.io/projected/8b8872b0-5d5a-4934-b298-33b61782bd55-kube-api-access-qln8q\") pod \"octavia-operator-controller-manager-7b787867f4-zc9tx\" (UID: \"8b8872b0-5d5a-4934-b298-33b61782bd55\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.704260 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.709951 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.724889 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-crgjg" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.810246 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.815712 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.819163 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.820249 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.822558 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675kk\" (UniqueName: \"kubernetes.io/projected/33869fcf-8635-4c66-8364-4fb107c8930e-kube-api-access-675kk\") pod \"placement-operator-controller-manager-589c58c6c-qrdqd\" (UID: \"33869fcf-8635-4c66-8364-4fb107c8930e\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.831187 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpwpd\" (UniqueName: \"kubernetes.io/projected/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-kube-api-access-xpwpd\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.831306 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.831373 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzvc\" (UniqueName: \"kubernetes.io/projected/593a4aeb-9c94-487c-bc8c-f234545762d6-kube-api-access-2kzvc\") pod \"ovn-operator-controller-manager-9976ff44c-2l6rr\" (UID: \"593a4aeb-9c94-487c-bc8c-f234545762d6\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.833209 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-nk4ct" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.838944 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.852045 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.868020 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.870343 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.876026 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.886075 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rnm7z" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.891598 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.911974 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.930602 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.934687 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.935064 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.935154 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzvc\" (UniqueName: \"kubernetes.io/projected/593a4aeb-9c94-487c-bc8c-f234545762d6-kube-api-access-2kzvc\") pod \"ovn-operator-controller-manager-9976ff44c-2l6rr\" (UID: \"593a4aeb-9c94-487c-bc8c-f234545762d6\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.935252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhvb\" (UniqueName: \"kubernetes.io/projected/ce0e3802-3a95-41a0-91cf-6584596b44ec-kube-api-access-lwhvb\") pod \"telemetry-operator-controller-manager-b8d54b5d7-x5w98\" (UID: \"ce0e3802-3a95-41a0-91cf-6584596b44ec\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.935294 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675kk\" (UniqueName: \"kubernetes.io/projected/33869fcf-8635-4c66-8364-4fb107c8930e-kube-api-access-675kk\") pod \"placement-operator-controller-manager-589c58c6c-qrdqd\" (UID: \"33869fcf-8635-4c66-8364-4fb107c8930e\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.935329 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpwpd\" (UniqueName: \"kubernetes.io/projected/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-kube-api-access-xpwpd\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:17 crc kubenswrapper[4835]: E1002 11:11:17.935877 4835 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:11:17 crc kubenswrapper[4835]: E1002 11:11:17.936072 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert podName:36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f nodeName:}" failed. No retries permitted until 2025-10-02 11:11:18.436041782 +0000 UTC m=+954.995949533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert") pod "openstack-baremetal-operator-controller-manager-5869cb545-btrqt" (UID: "36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.942715 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-kzmtv"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.945734 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.952358 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gzdtf" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.964117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzvc\" (UniqueName: \"kubernetes.io/projected/593a4aeb-9c94-487c-bc8c-f234545762d6-kube-api-access-2kzvc\") pod \"ovn-operator-controller-manager-9976ff44c-2l6rr\" (UID: \"593a4aeb-9c94-487c-bc8c-f234545762d6\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.966985 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675kk\" (UniqueName: \"kubernetes.io/projected/33869fcf-8635-4c66-8364-4fb107c8930e-kube-api-access-675kk\") pod \"placement-operator-controller-manager-589c58c6c-qrdqd\" (UID: \"33869fcf-8635-4c66-8364-4fb107c8930e\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.967494 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-kzmtv"] Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.978068 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpwpd\" (UniqueName: \"kubernetes.io/projected/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-kube-api-access-xpwpd\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:17 crc kubenswrapper[4835]: I1002 11:11:17.979659 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.003004 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.006251 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.012988 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5d5m9" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.032480 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.037709 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.037769 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhvb\" (UniqueName: \"kubernetes.io/projected/ce0e3802-3a95-41a0-91cf-6584596b44ec-kube-api-access-lwhvb\") pod \"telemetry-operator-controller-manager-b8d54b5d7-x5w98\" (UID: \"ce0e3802-3a95-41a0-91cf-6584596b44ec\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.037871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxftd\" (UniqueName: \"kubernetes.io/projected/d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf-kube-api-access-dxftd\") pod \"swift-operator-controller-manager-84d6b4b759-42v6d\" (UID: \"d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" Oct 02 11:11:18 crc kubenswrapper[4835]: E1002 11:11:18.038057 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:18 crc kubenswrapper[4835]: E1002 11:11:18.038123 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert podName:971d8eb9-c70a-45b2-a7c3-20e6b62bbd48 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:19.038097745 +0000 UTC m=+955.598005326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert") pod "infra-operator-controller-manager-9d6c5db85-2bldp" (UID: "971d8eb9-c70a-45b2-a7c3-20e6b62bbd48") : secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.059776 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.065372 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.066892 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.069503 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-n9p2v" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.069578 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.079295 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.093906 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwhvb\" (UniqueName: \"kubernetes.io/projected/ce0e3802-3a95-41a0-91cf-6584596b44ec-kube-api-access-lwhvb\") pod \"telemetry-operator-controller-manager-b8d54b5d7-x5w98\" (UID: \"ce0e3802-3a95-41a0-91cf-6584596b44ec\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.110795 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.112558 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.117132 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8ptv6" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.120861 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.139053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/598297ec-cf43-432d-b7b9-67cc5c52ee46-kube-api-access-lfl2h\") pod \"watcher-operator-controller-manager-6b9957f54f-4nlhw\" (UID: \"598297ec-cf43-432d-b7b9-67cc5c52ee46\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.139139 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxmq\" (UniqueName: \"kubernetes.io/projected/efd33f31-5093-4354-aa38-e40279007a57-kube-api-access-rtxmq\") pod \"test-operator-controller-manager-85777745bb-kzmtv\" (UID: \"efd33f31-5093-4354-aa38-e40279007a57\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.139173 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxftd\" (UniqueName: \"kubernetes.io/projected/d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf-kube-api-access-dxftd\") pod \"swift-operator-controller-manager-84d6b4b759-42v6d\" (UID: \"d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.179316 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxftd\" (UniqueName: 
\"kubernetes.io/projected/d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf-kube-api-access-dxftd\") pod \"swift-operator-controller-manager-84d6b4b759-42v6d\" (UID: \"d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.237969 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.264059 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b04cc853-44aa-4377-9c5d-339b1bbd0a78-cert\") pod \"openstack-operator-controller-manager-7b66d9b9d9-k6n7b\" (UID: \"b04cc853-44aa-4377-9c5d-339b1bbd0a78\") " pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.264178 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/598297ec-cf43-432d-b7b9-67cc5c52ee46-kube-api-access-lfl2h\") pod \"watcher-operator-controller-manager-6b9957f54f-4nlhw\" (UID: \"598297ec-cf43-432d-b7b9-67cc5c52ee46\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.264212 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtxmq\" (UniqueName: \"kubernetes.io/projected/efd33f31-5093-4354-aa38-e40279007a57-kube-api-access-rtxmq\") pod \"test-operator-controller-manager-85777745bb-kzmtv\" (UID: \"efd33f31-5093-4354-aa38-e40279007a57\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.264245 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlnhj\" (UniqueName: \"kubernetes.io/projected/b04cc853-44aa-4377-9c5d-339b1bbd0a78-kube-api-access-nlnhj\") pod \"openstack-operator-controller-manager-7b66d9b9d9-k6n7b\" (UID: \"b04cc853-44aa-4377-9c5d-339b1bbd0a78\") " pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.264274 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcz22\" (UniqueName: \"kubernetes.io/projected/1293e5be-4d9a-40ca-81b8-576f674acd7c-kube-api-access-gcz22\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv\" (UID: \"1293e5be-4d9a-40ca-81b8-576f674acd7c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.266071 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.303599 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/598297ec-cf43-432d-b7b9-67cc5c52ee46-kube-api-access-lfl2h\") pod \"watcher-operator-controller-manager-6b9957f54f-4nlhw\" (UID: \"598297ec-cf43-432d-b7b9-67cc5c52ee46\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.311210 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtxmq\" (UniqueName: \"kubernetes.io/projected/efd33f31-5093-4354-aa38-e40279007a57-kube-api-access-rtxmq\") pod \"test-operator-controller-manager-85777745bb-kzmtv\" (UID: \"efd33f31-5093-4354-aa38-e40279007a57\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.369033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlnhj\" (UniqueName: \"kubernetes.io/projected/b04cc853-44aa-4377-9c5d-339b1bbd0a78-kube-api-access-nlnhj\") pod \"openstack-operator-controller-manager-7b66d9b9d9-k6n7b\" (UID: \"b04cc853-44aa-4377-9c5d-339b1bbd0a78\") " pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.369089 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcz22\" (UniqueName: \"kubernetes.io/projected/1293e5be-4d9a-40ca-81b8-576f674acd7c-kube-api-access-gcz22\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv\" (UID: \"1293e5be-4d9a-40ca-81b8-576f674acd7c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.369155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b04cc853-44aa-4377-9c5d-339b1bbd0a78-cert\") pod \"openstack-operator-controller-manager-7b66d9b9d9-k6n7b\" (UID: \"b04cc853-44aa-4377-9c5d-339b1bbd0a78\") " pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: E1002 11:11:18.369335 4835 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 11:11:18 crc kubenswrapper[4835]: E1002 11:11:18.369404 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04cc853-44aa-4377-9c5d-339b1bbd0a78-cert podName:b04cc853-44aa-4377-9c5d-339b1bbd0a78 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:18.869380208 +0000 UTC m=+955.429287789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b04cc853-44aa-4377-9c5d-339b1bbd0a78-cert") pod "openstack-operator-controller-manager-7b66d9b9d9-k6n7b" (UID: "b04cc853-44aa-4377-9c5d-339b1bbd0a78") : secret "webhook-server-cert" not found Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.406142 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcz22\" (UniqueName: \"kubernetes.io/projected/1293e5be-4d9a-40ca-81b8-576f674acd7c-kube-api-access-gcz22\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv\" (UID: \"1293e5be-4d9a-40ca-81b8-576f674acd7c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.406644 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlnhj\" (UniqueName: \"kubernetes.io/projected/b04cc853-44aa-4377-9c5d-339b1bbd0a78-kube-api-access-nlnhj\") pod \"openstack-operator-controller-manager-7b66d9b9d9-k6n7b\" (UID: \"b04cc853-44aa-4377-9c5d-339b1bbd0a78\") " pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.463149 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.470188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:18 crc kubenswrapper[4835]: E1002 11:11:18.470413 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:11:18 crc kubenswrapper[4835]: E1002 11:11:18.470476 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert podName:36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f nodeName:}" failed. No retries permitted until 2025-10-02 11:11:19.470456134 +0000 UTC m=+956.030363725 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert") pod "openstack-baremetal-operator-controller-manager-5869cb545-btrqt" (UID: "36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.507317 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.598115 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.599009 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.753304 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.830433 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.849833 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82"] Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.887408 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b04cc853-44aa-4377-9c5d-339b1bbd0a78-cert\") pod \"openstack-operator-controller-manager-7b66d9b9d9-k6n7b\" (UID: \"b04cc853-44aa-4377-9c5d-339b1bbd0a78\") " pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.896187 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b04cc853-44aa-4377-9c5d-339b1bbd0a78-cert\") pod \"openstack-operator-controller-manager-7b66d9b9d9-k6n7b\" (UID: \"b04cc853-44aa-4377-9c5d-339b1bbd0a78\") " pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:18 crc kubenswrapper[4835]: W1002 11:11:18.961730 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0c4310c_242a_4a50_b5b3_6b1705d8ce4d.slice/crio-32643b28684fb13be39eacac460f068caad1072f4042d700197f315881c5bdeb WatchSource:0}: Error finding container 32643b28684fb13be39eacac460f068caad1072f4042d700197f315881c5bdeb: Status 404 returned error can't find the container with id 32643b28684fb13be39eacac460f068caad1072f4042d700197f315881c5bdeb Oct 02 11:11:18 crc kubenswrapper[4835]: I1002 11:11:18.964861 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.096553 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.098445 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.106107 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.106955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.107127 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.107466 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert podName:971d8eb9-c70a-45b2-a7c3-20e6b62bbd48 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:21.107434102 +0000 UTC m=+957.667341683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert") pod "infra-operator-controller-manager-9d6c5db85-2bldp" (UID: "971d8eb9-c70a-45b2-a7c3-20e6b62bbd48") : secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.130704 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd574af81_6939_4076_8194_049c15ffb305.slice/crio-1397a5d75122890e5277dcb381b486091738dde1d59400a880015303f56faf64 WatchSource:0}: Error finding container 1397a5d75122890e5277dcb381b486091738dde1d59400a880015303f56faf64: Status 404 returned error can't find the container with id 1397a5d75122890e5277dcb381b486091738dde1d59400a880015303f56faf64 Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.310333 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.481163 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.491858 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.512284 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.529140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-btrqt\" (UID: \"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.540930 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" event={"ID":"f600efc9-bc85-4462-901d-10cb6ec3113c","Type":"ContainerStarted","Data":"6b2232ececba5dfda830d90bee6eec9afdad67990a28dda4e099a008a3136396"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.558562 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" event={"ID":"3c8953f0-3559-496e-a893-76a065eea629","Type":"ContainerStarted","Data":"bea1736e710175951756127c16e176d7649e9170a9db74224cc2b4f4dbc54030"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.561356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" event={"ID":"d574af81-6939-4076-8194-049c15ffb305","Type":"ContainerStarted","Data":"1397a5d75122890e5277dcb381b486091738dde1d59400a880015303f56faf64"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.575465 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" event={"ID":"e0c4310c-242a-4a50-b5b3-6b1705d8ce4d","Type":"ContainerStarted","Data":"32643b28684fb13be39eacac460f068caad1072f4042d700197f315881c5bdeb"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.579286 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" event={"ID":"3267ccbe-611d-45a1-86fd-b901c6b52373","Type":"ContainerStarted","Data":"3c62a79dbd864367c3f762c5369ea6496cdab965f7aed770450c636a93b99e0b"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.581355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" event={"ID":"bb6fcd5f-03e8-4d83-bfeb-6e91b2852548","Type":"ContainerStarted","Data":"3c9b234dc7020e8cbfdb894934252708bba513c9a2923da9d1e02c0fc3ca4eae"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.587272 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" event={"ID":"53591aeb-91e2-4d05-b596-6b9d5b7dcd3f","Type":"ContainerStarted","Data":"e4d77d454da478735c1b8281022043c303f7de67e68520dd08ed645208c88c22"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.592260 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" event={"ID":"0805fa88-ea1a-4dec-b686-1024df504971","Type":"ContainerStarted","Data":"3609207effa7000223bee3ba545301c0973deda9efd74386cef4ea553104e7cf"} Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.592686 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.593401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" 
event={"ID":"365587d7-12ca-4826-90c7-b56fac3ac05b","Type":"ContainerStarted","Data":"1bde9c7d2d3944aedb21bbe79fe05b600a77c3f4e8e94335b3e3045890939de5"} Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.602039 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8872b0_5d5a_4934_b298_33b61782bd55.slice/crio-63920a4063bee9d407746a2831b17ed6f09fd798034077356a605f9cce94eae0 WatchSource:0}: Error finding container 63920a4063bee9d407746a2831b17ed6f09fd798034077356a605f9cce94eae0: Status 404 returned error can't find the container with id 63920a4063bee9d407746a2831b17ed6f09fd798034077356a605f9cce94eae0 Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.607667 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn"] Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.617920 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod574e5f9c_8e3f_4c9e_b562_ff40fa42ac3e.slice/crio-ecf80017a42526ea24df8d10c52bcfd1a6cfeb41f38534236788bbd98110848b WatchSource:0}: Error finding container ecf80017a42526ea24df8d10c52bcfd1a6cfeb41f38534236788bbd98110848b: Status 404 returned error can't find the container with id ecf80017a42526ea24df8d10c52bcfd1a6cfeb41f38534236788bbd98110848b Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.767197 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.794990 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.805078 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.809156 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv"] Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.835930 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598297ec_cf43_432d_b7b9_67cc5c52ee46.slice/crio-e832082a8fa53bb0dc7b9324c56c337f7b94e586ba539d8d8d8a83bda4de0d3c WatchSource:0}: Error finding container e832082a8fa53bb0dc7b9324c56c337f7b94e586ba539d8d8d8a83bda4de0d3c: Status 404 returned error can't find the container with id e832082a8fa53bb0dc7b9324c56c337f7b94e586ba539d8d8d8a83bda4de0d3c Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.841404 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.849587 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d"] Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.853053 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfl2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b9957f54f-4nlhw_openstack-operators(598297ec-cf43-432d-b7b9-67cc5c52ee46): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.858828 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-kzmtv"] Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.863929 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dxftd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-84d6b4b759-42v6d_openstack-operators(d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.865654 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rtxmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-kzmtv_openstack-operators(efd33f31-5093-4354-aa38-e40279007a57): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.865835 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod593a4aeb_9c94_487c_bc8c_f234545762d6.slice/crio-e3732b47e9e31220522716e342f2adcfab773e0d30e056b6ceaeddd58599365f WatchSource:0}: Error finding container e3732b47e9e31220522716e342f2adcfab773e0d30e056b6ceaeddd58599365f: Status 404 returned error can't find the container with id e3732b47e9e31220522716e342f2adcfab773e0d30e056b6ceaeddd58599365f Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.868245 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2kzvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-2l6rr_openstack-operators(593a4aeb-9c94-487c-bc8c-f234545762d6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.871422 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.879665 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98"] Oct 02 11:11:19 crc kubenswrapper[4835]: I1002 11:11:19.884812 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b"] Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.890320 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bda3e4_db9e_4d2c_a352_71f1cde3536b.slice/crio-c9fd311df1adf324d610e0e57b23c1d7b4e5d6326f6cb3d8404d24a321e832ae WatchSource:0}: Error finding container c9fd311df1adf324d610e0e57b23c1d7b4e5d6326f6cb3d8404d24a321e832ae: Status 404 returned error can't find the container with id c9fd311df1adf324d610e0e57b23c1d7b4e5d6326f6cb3d8404d24a321e832ae Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.892787 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0e3802_3a95_41a0_91cf_6584596b44ec.slice/crio-bd69bc190c79e4b1c9e5bfca977cbd11609473e58dcba222b1b9004d9fcb7269 WatchSource:0}: Error finding container bd69bc190c79e4b1c9e5bfca977cbd11609473e58dcba222b1b9004d9fcb7269: Status 404 returned error can't find the container with id bd69bc190c79e4b1c9e5bfca977cbd11609473e58dcba222b1b9004d9fcb7269 Oct 02 11:11:19 crc kubenswrapper[4835]: W1002 11:11:19.897008 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04cc853_44aa_4377_9c5d_339b1bbd0a78.slice/crio-57c9a73ae816169287a4eb084afc013c112854fb0ccfad58d535f83f5f432844 WatchSource:0}: Error finding container 57c9a73ae816169287a4eb084afc013c112854fb0ccfad58d535f83f5f432844: Status 404 returned error can't find the container with id 57c9a73ae816169287a4eb084afc013c112854fb0ccfad58d535f83f5f432844 Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.898147 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lwhvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-x5w98_openstack-operators(ce0e3802-3a95-41a0-91cf-6584596b44ec): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:19 crc kubenswrapper[4835]: E1002 11:11:19.903475 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vffcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-88c7-sqbrr_openstack-operators(b8bda3e4-db9e-4d2c-a352-71f1cde3536b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.370735 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt"] Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.577854 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" podUID="ce0e3802-3a95-41a0-91cf-6584596b44ec" Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.610747 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" event={"ID":"593a4aeb-9c94-487c-bc8c-f234545762d6","Type":"ContainerStarted","Data":"23579e7437468a9f0f5c843aa7a67f9fa2deea0d2ce956e53d58a45267247118"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.610802 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" event={"ID":"593a4aeb-9c94-487c-bc8c-f234545762d6","Type":"ContainerStarted","Data":"e3732b47e9e31220522716e342f2adcfab773e0d30e056b6ceaeddd58599365f"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.612533 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" event={"ID":"d6b4f40c-4be2-445b-ab93-583917fb3d1a","Type":"ContainerStarted","Data":"954a7a7bf71bad4d309764208fa73d7b899e306d142d859e7d6de3628fa869f9"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.615303 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" event={"ID":"8b8872b0-5d5a-4934-b298-33b61782bd55","Type":"ContainerStarted","Data":"63920a4063bee9d407746a2831b17ed6f09fd798034077356a605f9cce94eae0"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.616677 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" 
event={"ID":"574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e","Type":"ContainerStarted","Data":"ecf80017a42526ea24df8d10c52bcfd1a6cfeb41f38534236788bbd98110848b"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.619578 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" event={"ID":"33869fcf-8635-4c66-8364-4fb107c8930e","Type":"ContainerStarted","Data":"27aeb4f65dea78f25248ed522680fb859bba835f6fbf0c412d627c2e95cc74b5"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.620870 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" event={"ID":"1293e5be-4d9a-40ca-81b8-576f674acd7c","Type":"ContainerStarted","Data":"93e6f7025c58a46151085e19d2debe8a5cf2de704b32954dfa9383a1b1dddc7a"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.621945 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" event={"ID":"598297ec-cf43-432d-b7b9-67cc5c52ee46","Type":"ContainerStarted","Data":"cce8da66805e7b7ff92210287308293a7b43e6ad42e8360c792525997cdd790b"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.621968 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" event={"ID":"598297ec-cf43-432d-b7b9-67cc5c52ee46","Type":"ContainerStarted","Data":"e832082a8fa53bb0dc7b9324c56c337f7b94e586ba539d8d8d8a83bda4de0d3c"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.623894 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" event={"ID":"b04cc853-44aa-4377-9c5d-339b1bbd0a78","Type":"ContainerStarted","Data":"e624225059fc08f96096f2dff9df7e5c0b377834ecdac39d2987f46a22e9f2bf"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.623922 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" event={"ID":"b04cc853-44aa-4377-9c5d-339b1bbd0a78","Type":"ContainerStarted","Data":"57c9a73ae816169287a4eb084afc013c112854fb0ccfad58d535f83f5f432844"} Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.639744 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" podUID="efd33f31-5093-4354-aa38-e40279007a57" Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.644084 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" podUID="d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf" Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.644400 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" event={"ID":"d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf","Type":"ContainerStarted","Data":"b5c66b18dbc06acb51805bc5fc1fd8df029824d0a68fb2c3b725f37cfa8c75b6"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.644429 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" 
event={"ID":"d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf","Type":"ContainerStarted","Data":"bfa3473261e05002e6bff1d061f2668d248f44b8fc45868c23fb7ec287b143a5"} Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.644575 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" podUID="b8bda3e4-db9e-4d2c-a352-71f1cde3536b" Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.644809 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" podUID="593a4aeb-9c94-487c-bc8c-f234545762d6" Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.644909 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" podUID="598297ec-cf43-432d-b7b9-67cc5c52ee46" Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.647457 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" event={"ID":"ce0e3802-3a95-41a0-91cf-6584596b44ec","Type":"ContainerStarted","Data":"1af018007bc4a5853848db4e211f90e6edf4ea36baddd865ffe0f3be4f7f6193"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.647486 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" event={"ID":"ce0e3802-3a95-41a0-91cf-6584596b44ec","Type":"ContainerStarted","Data":"bd69bc190c79e4b1c9e5bfca977cbd11609473e58dcba222b1b9004d9fcb7269"} Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.663505 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" podUID="ce0e3802-3a95-41a0-91cf-6584596b44ec" Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.666150 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" event={"ID":"b8bda3e4-db9e-4d2c-a352-71f1cde3536b","Type":"ContainerStarted","Data":"9698ddd1b6164c54498706664ffbc9b9ab4731957ef4312fef1b9109f6d548b8"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.666241 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" event={"ID":"b8bda3e4-db9e-4d2c-a352-71f1cde3536b","Type":"ContainerStarted","Data":"c9fd311df1adf324d610e0e57b23c1d7b4e5d6326f6cb3d8404d24a321e832ae"} Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.679559 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" podUID="b8bda3e4-db9e-4d2c-a352-71f1cde3536b" Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.682410 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" event={"ID":"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f","Type":"ContainerStarted","Data":"3cc8125ce0538a3eee1a1654a0eb9632114a9aa1a0686958b1c3fd7924e54511"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.683989 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" event={"ID":"efd33f31-5093-4354-aa38-e40279007a57","Type":"ContainerStarted","Data":"a4974fee53a8fb0009542ca0f38fb4e9eb63a082493ed0e68291297194f560d1"} Oct 02 11:11:20 crc kubenswrapper[4835]: I1002 11:11:20.684017 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" event={"ID":"efd33f31-5093-4354-aa38-e40279007a57","Type":"ContainerStarted","Data":"5fabfb86b99790c8a6a7727da671da44ae5079801efbeeb5724ddd39bc42efc8"} Oct 02 11:11:20 crc kubenswrapper[4835]: E1002 11:11:20.696011 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" podUID="efd33f31-5093-4354-aa38-e40279007a57" Oct 02 11:11:21 crc kubenswrapper[4835]: I1002 11:11:21.150084 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:21 crc kubenswrapper[4835]: I1002 11:11:21.163101 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/971d8eb9-c70a-45b2-a7c3-20e6b62bbd48-cert\") pod \"infra-operator-controller-manager-9d6c5db85-2bldp\" (UID: \"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:21 crc kubenswrapper[4835]: I1002 11:11:21.192588 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:21 crc kubenswrapper[4835]: I1002 11:11:21.696809 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" event={"ID":"b04cc853-44aa-4377-9c5d-339b1bbd0a78","Type":"ContainerStarted","Data":"5ef926a2e36d6b7f1c4ff19ef705f6f79db1cc385d68bdea961b5437a220d89f"} Oct 02 11:11:21 crc kubenswrapper[4835]: I1002 11:11:21.697339 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:21 crc kubenswrapper[4835]: E1002 11:11:21.699261 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" podUID="ce0e3802-3a95-41a0-91cf-6584596b44ec" Oct 02 11:11:21 crc kubenswrapper[4835]: E1002 11:11:21.699306 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" podUID="efd33f31-5093-4354-aa38-e40279007a57" Oct 02 11:11:21 crc kubenswrapper[4835]: E1002 11:11:21.699546 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" podUID="593a4aeb-9c94-487c-bc8c-f234545762d6" Oct 02 11:11:21 crc kubenswrapper[4835]: E1002 11:11:21.699558 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" podUID="b8bda3e4-db9e-4d2c-a352-71f1cde3536b" Oct 02 11:11:21 crc kubenswrapper[4835]: E1002 11:11:21.699926 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" podUID="d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf" Oct 02 11:11:21 crc kubenswrapper[4835]: E1002 11:11:21.700356 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" podUID="598297ec-cf43-432d-b7b9-67cc5c52ee46" Oct 02 11:11:21 crc kubenswrapper[4835]: I1002 11:11:21.755809 4835 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp"] Oct 02 11:11:21 crc kubenswrapper[4835]: I1002 11:11:21.917076 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" podStartSLOduration=4.91705447 podStartE2EDuration="4.91705447s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:21.90779429 +0000 UTC m=+958.467701881" watchObservedRunningTime="2025-10-02 11:11:21.91705447 +0000 UTC m=+958.476962071" Oct 02 11:11:22 crc kubenswrapper[4835]: E1002 11:11:22.709714 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" podUID="d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf" Oct 02 11:11:23 crc kubenswrapper[4835]: W1002 11:11:23.933641 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod971d8eb9_c70a_45b2_a7c3_20e6b62bbd48.slice/crio-53ec7844aa5821096d609fd5d28c979081960e46a61e6d28ccd11a7d020ce395 WatchSource:0}: Error finding container 53ec7844aa5821096d609fd5d28c979081960e46a61e6d28ccd11a7d020ce395: Status 404 returned error can't find the container with id 53ec7844aa5821096d609fd5d28c979081960e46a61e6d28ccd11a7d020ce395 Oct 02 11:11:24 crc kubenswrapper[4835]: I1002 11:11:24.718845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" event={"ID":"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48","Type":"ContainerStarted","Data":"53ec7844aa5821096d609fd5d28c979081960e46a61e6d28ccd11a7d020ce395"} Oct 02 11:11:29 crc kubenswrapper[4835]: I1002 11:11:29.104744 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7b66d9b9d9-k6n7b" Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.875082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" event={"ID":"e0c4310c-242a-4a50-b5b3-6b1705d8ce4d","Type":"ContainerStarted","Data":"f4f434ee1b04b5e0e9c9d2d302a07a9632fb339161d04c0bc61c3db2482f93de"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.881465 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" event={"ID":"d6b4f40c-4be2-445b-ab93-583917fb3d1a","Type":"ContainerStarted","Data":"ebe76d4b26224675f52bc49313196402669bcece5da57c58c156f0bf6358da1f"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.887677 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" event={"ID":"bb6fcd5f-03e8-4d83-bfeb-6e91b2852548","Type":"ContainerStarted","Data":"23252312c4fe9fc528c84d09e704170af87bfe2f728f1e8003bc9f20d9be6ecc"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.908650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" 
event={"ID":"3267ccbe-611d-45a1-86fd-b901c6b52373","Type":"ContainerStarted","Data":"cefcc690b0fb3a4b4b22a2b40909e825d838a4ece43730e5d455f2ac73fe0bf3"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.918073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" event={"ID":"574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e","Type":"ContainerStarted","Data":"eb0d91bdebe1cca12253b416f47902314d4a4295f59f434b67d2c97a2df105f9"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.918129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" event={"ID":"574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e","Type":"ContainerStarted","Data":"82a5fd8c08b6cc30c22dab77d8d128cbd835635fe1aa5034a3071096c68325b1"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.918238 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.922297 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" event={"ID":"365587d7-12ca-4826-90c7-b56fac3ac05b","Type":"ContainerStarted","Data":"e0360b5ec96b64f3a9f203409bda04c5767593fa688a65745330dd0f410c45ac"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.936594 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" event={"ID":"d574af81-6939-4076-8194-049c15ffb305","Type":"ContainerStarted","Data":"75f4aa3d5d477c9c4253ab24f2ca4417ef2912fba3a3458fdd680215ef78642c"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.936680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.939278 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" event={"ID":"f600efc9-bc85-4462-901d-10cb6ec3113c","Type":"ContainerStarted","Data":"08c3991a9f2856ebfd4d4de64b13d31189f93db5cd2110d349ef3eb702a3e765"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.940709 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" podStartSLOduration=4.018898464 podStartE2EDuration="17.940693087s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.620399622 +0000 UTC m=+956.180307203" lastFinishedPulling="2025-10-02 11:11:33.542194245 +0000 UTC m=+970.102101826" observedRunningTime="2025-10-02 11:11:34.938151576 +0000 UTC m=+971.498059167" watchObservedRunningTime="2025-10-02 11:11:34.940693087 +0000 UTC m=+971.500600668" Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.942804 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" event={"ID":"1293e5be-4d9a-40ca-81b8-576f674acd7c","Type":"ContainerStarted","Data":"00d0432bc711feb428c668d24cf1e0f1b280af5867c4814c57dc5307db6c5096"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.955255 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" 
event={"ID":"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48","Type":"ContainerStarted","Data":"1af618ee6e890bcbc7f9c32b40730feb2315dfc0d92af3032c1303d5c3202583"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.962167 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" event={"ID":"0805fa88-ea1a-4dec-b686-1024df504971","Type":"ContainerStarted","Data":"0bda535e563589802ff9cbdcbb6ee69559d27050f2c6c35ddd1e6a686b6d29ee"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.972678 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" event={"ID":"3c8953f0-3559-496e-a893-76a065eea629","Type":"ContainerStarted","Data":"f6680098b08dc16e0bda45523478d62033f503ece8ee7cdc73c64978bfcbd4ca"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.973824 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" podStartSLOduration=3.7335143779999997 podStartE2EDuration="17.973786996s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.139843581 +0000 UTC m=+955.699751162" lastFinishedPulling="2025-10-02 11:11:33.380116179 +0000 UTC m=+969.940023780" observedRunningTime="2025-10-02 11:11:34.959354841 +0000 UTC m=+971.519262422" watchObservedRunningTime="2025-10-02 11:11:34.973786996 +0000 UTC m=+971.533694577" Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.975210 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" event={"ID":"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f","Type":"ContainerStarted","Data":"d44fe615591790c41891fe7d3a5d8692c23143ee1435fe358d060cb1c85af312"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.977975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" event={"ID":"8b8872b0-5d5a-4934-b298-33b61782bd55","Type":"ContainerStarted","Data":"8ac9bd79b99c520151062f1c8b645185fd9f39e5ebc0bf110ac14bbd57a8990d"} Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.978813 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" Oct 02 11:11:34 crc kubenswrapper[4835]: I1002 11:11:34.991535 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv" podStartSLOduration=4.305991698 podStartE2EDuration="17.991458232s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.852677478 +0000 UTC m=+956.412585059" lastFinishedPulling="2025-10-02 11:11:33.538143962 +0000 UTC m=+970.098051593" observedRunningTime="2025-10-02 11:11:34.984029503 +0000 UTC m=+971.543937084" watchObservedRunningTime="2025-10-02 11:11:34.991458232 +0000 UTC m=+971.551365813" Oct 02 11:11:35 crc kubenswrapper[4835]: I1002 11:11:34.996282 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" event={"ID":"33869fcf-8635-4c66-8364-4fb107c8930e","Type":"ContainerStarted","Data":"08c22fa4b27b6bb68ecab7dbb2aefe0fe893b57b90f9125bcc724f32f6543abb"} Oct 02 11:11:35 crc kubenswrapper[4835]: I1002 11:11:35.000956 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" event={"ID":"53591aeb-91e2-4d05-b596-6b9d5b7dcd3f","Type":"ContainerStarted","Data":"2234194b015778fb66453b0ad1b2d7e8659fec28bbae80d79bf0da469dafe855"} Oct 02 11:11:35 crc kubenswrapper[4835]: I1002 11:11:35.042695 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" podStartSLOduration=4.109756852 podStartE2EDuration="18.042677848s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.605074512 +0000 UTC m=+956.164982093" lastFinishedPulling="2025-10-02 11:11:33.537995508 +0000 UTC m=+970.097903089" observedRunningTime="2025-10-02 11:11:35.040794126 +0000 UTC m=+971.600701707" watchObservedRunningTime="2025-10-02 11:11:35.042677848 +0000 UTC m=+971.602585429" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.018086 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" event={"ID":"365587d7-12ca-4826-90c7-b56fac3ac05b","Type":"ContainerStarted","Data":"6a7c2bbb508b8fc6a6c4a5db7a8af7bf38e6b98e537f4acaa53bab0d2afad45c"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.018518 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.025379 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" event={"ID":"d6b4f40c-4be2-445b-ab93-583917fb3d1a","Type":"ContainerStarted","Data":"880931b616063401f79f001436f817555eb95d7ec6ba929afbb7a8d43624344e"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.025540 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.028195 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" event={"ID":"3267ccbe-611d-45a1-86fd-b901c6b52373","Type":"ContainerStarted","Data":"5803f7b43d8950c57e7f85c414facde9b4993aae83a09d3956b9c9bc70228f36"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.028469 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.031996 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" event={"ID":"bb6fcd5f-03e8-4d83-bfeb-6e91b2852548","Type":"ContainerStarted","Data":"fd87a7a18c1a30da99e4360f7deb47c181a51ec18a4e8ff25dc440e14a98e692"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.032124 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.041417 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" podStartSLOduration=4.889966246 podStartE2EDuration="19.041364951s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.314619404 +0000 UTC m=+955.874526995" 
lastFinishedPulling="2025-10-02 11:11:33.466018119 +0000 UTC m=+970.025925700" observedRunningTime="2025-10-02 11:11:36.041198946 +0000 UTC m=+972.601106547" watchObservedRunningTime="2025-10-02 11:11:36.041364951 +0000 UTC m=+972.601272532" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.044735 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" event={"ID":"33869fcf-8635-4c66-8364-4fb107c8930e","Type":"ContainerStarted","Data":"3c08fb29f444c40872fe83eb8aacc111aef4c6f8b732a73c0c20fae3943b9f98"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.044923 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.048034 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" event={"ID":"e0c4310c-242a-4a50-b5b3-6b1705d8ce4d","Type":"ContainerStarted","Data":"2a20f3fd32a425d4837337f62db42d5173a0885b9a9163bd1d1e64cf19958fa9"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.048135 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.051606 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" event={"ID":"36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f","Type":"ContainerStarted","Data":"ab237f35554776640bbf7cce282ca978518c3429aa586b8d2b06c43a1750665e"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.051791 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.065950 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" podStartSLOduration=5.316187327 podStartE2EDuration="19.065931794s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.788368984 +0000 UTC m=+956.348276565" lastFinishedPulling="2025-10-02 11:11:33.538113451 +0000 UTC m=+970.098021032" observedRunningTime="2025-10-02 11:11:36.062892317 +0000 UTC m=+972.622799888" watchObservedRunningTime="2025-10-02 11:11:36.065931794 +0000 UTC m=+972.625839375" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.069036 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" event={"ID":"8b8872b0-5d5a-4934-b298-33b61782bd55","Type":"ContainerStarted","Data":"55d7c301f10bb9d3e7c5c2594eac09a572580286784052693401a3dbfb129d01"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.104450 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" event={"ID":"d574af81-6939-4076-8194-049c15ffb305","Type":"ContainerStarted","Data":"3536171430aad0c630bcc6811e95c7459d125aa208615ece945943e9c2315a58"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.108489 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" podStartSLOduration=4.758805255 
podStartE2EDuration="19.108473131s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.030085912 +0000 UTC m=+955.589993493" lastFinishedPulling="2025-10-02 11:11:33.379753778 +0000 UTC m=+969.939661369" observedRunningTime="2025-10-02 11:11:36.103931601 +0000 UTC m=+972.663839182" watchObservedRunningTime="2025-10-02 11:11:36.108473131 +0000 UTC m=+972.668380712" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.124395 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" event={"ID":"f600efc9-bc85-4462-901d-10cb6ec3113c","Type":"ContainerStarted","Data":"74ccf6f5eab4deb1af9007b963b9ff64bf275ae844c8f8b4d7a787d8426162af"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.124663 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.127212 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" event={"ID":"0805fa88-ea1a-4dec-b686-1024df504971","Type":"ContainerStarted","Data":"fb0c250da6d1af1c194ee806beaccb73d26b2df89e963208c117cba22133fa97"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.127585 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.135863 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" podStartSLOduration=4.718108209 podStartE2EDuration="19.135847875s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.119152471 +0000 UTC m=+955.679060052" lastFinishedPulling="2025-10-02 11:11:33.536892147 +0000 UTC m=+970.096799718" observedRunningTime="2025-10-02 11:11:36.131662775 +0000 UTC m=+972.691570366" watchObservedRunningTime="2025-10-02 11:11:36.135847875 +0000 UTC m=+972.695755456" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.147842 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" event={"ID":"53591aeb-91e2-4d05-b596-6b9d5b7dcd3f","Type":"ContainerStarted","Data":"d8c9a3d94c184c2a165e226e5e2c5b07ea603a00440c51d174c8a09d067b5b62"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.148519 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.150212 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" event={"ID":"971d8eb9-c70a-45b2-a7c3-20e6b62bbd48","Type":"ContainerStarted","Data":"fb09ca05261aa8ea13705ebfb2223ad02b3d8c97d07db63f27866c914196964f"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.150604 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.170694 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" podStartSLOduration=4.677836398 
podStartE2EDuration="19.170673051s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.045415282 +0000 UTC m=+955.605322863" lastFinishedPulling="2025-10-02 11:11:33.538251925 +0000 UTC m=+970.098159516" observedRunningTime="2025-10-02 11:11:36.156665251 +0000 UTC m=+972.716572842" watchObservedRunningTime="2025-10-02 11:11:36.170673051 +0000 UTC m=+972.730580632" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.206981 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" event={"ID":"3c8953f0-3559-496e-a893-76a065eea629","Type":"ContainerStarted","Data":"9c7945af84bc157c968c2081f8378f7a3473638be564302177d2db94bda8ba2a"} Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.208630 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.261898 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" podStartSLOduration=6.281251772 podStartE2EDuration="19.261877902s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:20.399125998 +0000 UTC m=+956.959033579" lastFinishedPulling="2025-10-02 11:11:33.379752118 +0000 UTC m=+969.939659709" observedRunningTime="2025-10-02 11:11:36.206801526 +0000 UTC m=+972.766709117" watchObservedRunningTime="2025-10-02 11:11:36.261877902 +0000 UTC m=+972.821785483" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.268609 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" podStartSLOduration=4.973489259 podStartE2EDuration="19.268588954s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:18.938596376 +0000 UTC m=+955.498503957" lastFinishedPulling="2025-10-02 11:11:33.233696071 +0000 UTC m=+969.793603652" observedRunningTime="2025-10-02 11:11:36.239999956 +0000 UTC m=+972.799907537" watchObservedRunningTime="2025-10-02 11:11:36.268588954 +0000 UTC m=+972.828496535" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.282657 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" podStartSLOduration=4.846581426 podStartE2EDuration="19.282638506s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.029739442 +0000 UTC m=+955.589647023" lastFinishedPulling="2025-10-02 11:11:33.465796522 +0000 UTC m=+970.025704103" observedRunningTime="2025-10-02 11:11:36.279502946 +0000 UTC m=+972.839410527" watchObservedRunningTime="2025-10-02 11:11:36.282638506 +0000 UTC m=+972.842546087" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.322428 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" podStartSLOduration=5.29826306 podStartE2EDuration="19.322402784s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.520211022 +0000 UTC m=+956.080118603" lastFinishedPulling="2025-10-02 11:11:33.544350746 +0000 UTC m=+970.104258327" observedRunningTime="2025-10-02 11:11:36.315798885 +0000 UTC m=+972.875706476" 
watchObservedRunningTime="2025-10-02 11:11:36.322402784 +0000 UTC m=+972.882310365" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.351440 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" podStartSLOduration=5.3302504729999995 podStartE2EDuration="19.351419514s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.519461051 +0000 UTC m=+956.079368642" lastFinishedPulling="2025-10-02 11:11:33.540630102 +0000 UTC m=+970.100537683" observedRunningTime="2025-10-02 11:11:36.349823659 +0000 UTC m=+972.909731250" watchObservedRunningTime="2025-10-02 11:11:36.351419514 +0000 UTC m=+972.911327095" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.384035 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" podStartSLOduration=5.04607995 podStartE2EDuration="19.384008677s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.042545012 +0000 UTC m=+955.602452593" lastFinishedPulling="2025-10-02 11:11:33.380473739 +0000 UTC m=+969.940381320" observedRunningTime="2025-10-02 11:11:36.381523446 +0000 UTC m=+972.941431017" watchObservedRunningTime="2025-10-02 11:11:36.384008677 +0000 UTC m=+972.943916268" Oct 02 11:11:36 crc kubenswrapper[4835]: I1002 11:11:36.412858 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" podStartSLOduration=9.805658565 podStartE2EDuration="19.412834142s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:23.93542329 +0000 UTC m=+960.495330861" lastFinishedPulling="2025-10-02 11:11:33.542598857 +0000 UTC m=+970.102506438" observedRunningTime="2025-10-02 11:11:36.409772784 +0000 UTC m=+972.969680365" watchObservedRunningTime="2025-10-02 11:11:36.412834142 +0000 UTC m=+972.972741723" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.324170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" event={"ID":"ce0e3802-3a95-41a0-91cf-6584596b44ec","Type":"ContainerStarted","Data":"5856cc971ae7463d1c24e024b9ff6e0cd0b096d025ca7a304d9ed34b75e73e11"} Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.325886 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.329860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" event={"ID":"598297ec-cf43-432d-b7b9-67cc5c52ee46","Type":"ContainerStarted","Data":"3dd8df50bab9af5eec81f756036252bb6e041366656e3cc653204c305f1ff2b7"} Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.330453 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.335796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" event={"ID":"593a4aeb-9c94-487c-bc8c-f234545762d6","Type":"ContainerStarted","Data":"0d0fc39b9ab15e160f5f4718fc23fb93734d2bdccbd214c83796276940674647"} Oct 02 11:11:39 crc 
kubenswrapper[4835]: I1002 11:11:39.336306 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.342485 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" event={"ID":"b8bda3e4-db9e-4d2c-a352-71f1cde3536b","Type":"ContainerStarted","Data":"114da6a1a212fa363bbd7f0cc4fa14e3d6aebb1a7173f7e9eb804d99e1bacdf9"} Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.342616 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" podStartSLOduration=4.017818199 podStartE2EDuration="22.342603331s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.897941918 +0000 UTC m=+956.457849499" lastFinishedPulling="2025-10-02 11:11:38.22272705 +0000 UTC m=+974.782634631" observedRunningTime="2025-10-02 11:11:39.340203322 +0000 UTC m=+975.900110903" watchObservedRunningTime="2025-10-02 11:11:39.342603331 +0000 UTC m=+975.902510912" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.342789 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.367326 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" podStartSLOduration=3.943864409 podStartE2EDuration="22.367306068s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.868108821 +0000 UTC m=+956.428016402" lastFinishedPulling="2025-10-02 11:11:38.29155048 +0000 UTC m=+974.851458061" observedRunningTime="2025-10-02 11:11:39.36073925 +0000 UTC m=+975.920646841" watchObservedRunningTime="2025-10-02 11:11:39.367306068 +0000 UTC m=+975.927213669" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.384723 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" podStartSLOduration=3.971315324 podStartE2EDuration="22.384702746s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.852850533 +0000 UTC m=+956.412758114" lastFinishedPulling="2025-10-02 11:11:38.266237955 +0000 UTC m=+974.826145536" observedRunningTime="2025-10-02 11:11:39.37856647 +0000 UTC m=+975.938474051" watchObservedRunningTime="2025-10-02 11:11:39.384702746 +0000 UTC m=+975.944610327" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.400459 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" podStartSLOduration=4.07738694 podStartE2EDuration="22.400435146s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.903298118 +0000 UTC m=+956.463205699" lastFinishedPulling="2025-10-02 11:11:38.226346324 +0000 UTC m=+974.786253905" observedRunningTime="2025-10-02 11:11:39.397321387 +0000 UTC m=+975.957228988" watchObservedRunningTime="2025-10-02 11:11:39.400435146 +0000 UTC m=+975.960342727" Oct 02 11:11:39 crc kubenswrapper[4835]: I1002 11:11:39.801756 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-btrqt" Oct 02 11:11:40 crc kubenswrapper[4835]: I1002 11:11:40.350834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" event={"ID":"efd33f31-5093-4354-aa38-e40279007a57","Type":"ContainerStarted","Data":"8be90d15e10c1a0e821d7e18bbc2cfabb3ba3804e18d316d5905d947d897703f"} Oct 02 11:11:40 crc kubenswrapper[4835]: I1002 11:11:40.352046 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" Oct 02 11:11:40 crc kubenswrapper[4835]: I1002 11:11:40.355147 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" event={"ID":"d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf","Type":"ContainerStarted","Data":"d3a5dd53e2e26bca2192d20ddd9a82063763d6105ba48723c4cbe293389761c0"} Oct 02 11:11:40 crc kubenswrapper[4835]: I1002 11:11:40.356234 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" Oct 02 11:11:40 crc kubenswrapper[4835]: I1002 11:11:40.371876 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" podStartSLOduration=3.144538484 podStartE2EDuration="23.371849808s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.865534169 +0000 UTC m=+956.425441750" lastFinishedPulling="2025-10-02 11:11:40.092845493 +0000 UTC m=+976.652753074" observedRunningTime="2025-10-02 11:11:40.368974575 +0000 UTC m=+976.928882176" watchObservedRunningTime="2025-10-02 11:11:40.371849808 +0000 UTC m=+976.931757389" Oct 02 11:11:40 crc kubenswrapper[4835]: I1002 11:11:40.393608 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" podStartSLOduration=3.11994467 podStartE2EDuration="23.393586249s" podCreationTimestamp="2025-10-02 11:11:17 +0000 UTC" firstStartedPulling="2025-10-02 11:11:19.86378915 +0000 UTC m=+956.423696731" lastFinishedPulling="2025-10-02 11:11:40.137430729 +0000 UTC m=+976.697338310" observedRunningTime="2025-10-02 11:11:40.390940753 +0000 UTC m=+976.950848334" watchObservedRunningTime="2025-10-02 11:11:40.393586249 +0000 UTC m=+976.953493830" Oct 02 11:11:41 crc kubenswrapper[4835]: I1002 11:11:41.200718 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-2bldp" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.383853 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n7dnn" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.413543 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6f6c6946b9-gqb82" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.486484 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-wfqz7" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.501716 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-84958c4d49-bwkr6" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.530280 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lgp76" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.569131 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6ggrv" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.627878 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-z4qtt" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.639691 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-fqn5z" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.671155 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-s85jn" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.822724 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-sqbrr" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.823602 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-srxgs" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.854928 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-xpmxm" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.885874 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-zc9tx" Oct 02 11:11:47 crc kubenswrapper[4835]: I1002 11:11:47.983966 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-qrdqd" Oct 02 11:11:48 crc kubenswrapper[4835]: I1002 11:11:48.063213 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2l6rr" Oct 02 11:11:48 crc kubenswrapper[4835]: I1002 11:11:48.242091 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-x5w98" Oct 02 11:11:48 crc kubenswrapper[4835]: I1002 11:11:48.273435 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-42v6d" Oct 02 11:11:48 crc kubenswrapper[4835]: I1002 11:11:48.465773 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-4nlhw" Oct 02 11:11:48 crc kubenswrapper[4835]: I1002 11:11:48.603052 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-kzmtv" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.086030 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lqkzx"] Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.088031 4835 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.090632 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.090926 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lzq5c" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.090801 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.093661 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.096515 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lqkzx"] Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.147044 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fpcs7"] Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.148826 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.158559 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.159434 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9p54\" (UniqueName: \"kubernetes.io/projected/596e7ffa-16a7-4ac2-9663-ebce2e11b409-kube-api-access-t9p54\") pod \"dnsmasq-dns-675f4bcbfc-lqkzx\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.159496 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596e7ffa-16a7-4ac2-9663-ebce2e11b409-config\") pod \"dnsmasq-dns-675f4bcbfc-lqkzx\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.164121 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fpcs7"] Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.260407 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqx4\" (UniqueName: \"kubernetes.io/projected/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-kube-api-access-8zqx4\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.260464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-config\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.260520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.260550 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9p54\" (UniqueName: \"kubernetes.io/projected/596e7ffa-16a7-4ac2-9663-ebce2e11b409-kube-api-access-t9p54\") pod \"dnsmasq-dns-675f4bcbfc-lqkzx\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.260602 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596e7ffa-16a7-4ac2-9663-ebce2e11b409-config\") pod \"dnsmasq-dns-675f4bcbfc-lqkzx\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.261936 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596e7ffa-16a7-4ac2-9663-ebce2e11b409-config\") pod \"dnsmasq-dns-675f4bcbfc-lqkzx\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.279618 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9p54\" (UniqueName: \"kubernetes.io/projected/596e7ffa-16a7-4ac2-9663-ebce2e11b409-kube-api-access-t9p54\") pod \"dnsmasq-dns-675f4bcbfc-lqkzx\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.362040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-config\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.362132 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.362306 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqx4\" (UniqueName: \"kubernetes.io/projected/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-kube-api-access-8zqx4\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.363055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.363377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-config\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.379450 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8zqx4\" (UniqueName: \"kubernetes.io/projected/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-kube-api-access-8zqx4\") pod \"dnsmasq-dns-78dd6ddcc-fpcs7\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.412675 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.471408 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.727821 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fpcs7"] Oct 02 11:12:06 crc kubenswrapper[4835]: I1002 11:12:06.850789 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lqkzx"] Oct 02 11:12:06 crc kubenswrapper[4835]: W1002 11:12:06.851130 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod596e7ffa_16a7_4ac2_9663_ebce2e11b409.slice/crio-0593d59ea8064b2ad457f3e21dff88861bb7a69146e0abd5ca340bf21179d87e WatchSource:0}: Error finding container 0593d59ea8064b2ad457f3e21dff88861bb7a69146e0abd5ca340bf21179d87e: Status 404 returned error can't find the container with id 0593d59ea8064b2ad457f3e21dff88861bb7a69146e0abd5ca340bf21179d87e Oct 02 11:12:07 crc kubenswrapper[4835]: I1002 11:12:07.575577 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" event={"ID":"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad","Type":"ContainerStarted","Data":"f0e83eb8cbaf13a33f679591c72dffb7b3e9bec6fb3032d77652680d06ae093f"} Oct 02 11:12:07 crc kubenswrapper[4835]: I1002 11:12:07.579864 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" event={"ID":"596e7ffa-16a7-4ac2-9663-ebce2e11b409","Type":"ContainerStarted","Data":"0593d59ea8064b2ad457f3e21dff88861bb7a69146e0abd5ca340bf21179d87e"} Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.182549 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lqkzx"] Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.207870 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4ggz9"] Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.209173 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.218206 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4ggz9"] Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.328712 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.328806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrj5\" (UniqueName: \"kubernetes.io/projected/5ba43879-dd06-48b6-ba75-9520108a075e-kube-api-access-4rrj5\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.328877 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-config\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.430820 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-config\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.430951 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.431043 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrj5\" (UniqueName: \"kubernetes.io/projected/5ba43879-dd06-48b6-ba75-9520108a075e-kube-api-access-4rrj5\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.432747 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-config\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.432830 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.464033 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrj5\" (UniqueName: 
\"kubernetes.io/projected/5ba43879-dd06-48b6-ba75-9520108a075e-kube-api-access-4rrj5\") pod \"dnsmasq-dns-666b6646f7-4ggz9\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.536123 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fpcs7"] Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.574038 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dqggc"] Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.575576 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.585034 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dqggc"] Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.591316 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.633493 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gpb\" (UniqueName: \"kubernetes.io/projected/d24483b5-1674-48c0-8330-1a96574d3460-kube-api-access-d9gpb\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.633610 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-config\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.633649 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.734699 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gpb\" (UniqueName: \"kubernetes.io/projected/d24483b5-1674-48c0-8330-1a96574d3460-kube-api-access-d9gpb\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.734789 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-config\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.734811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.735867 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.736782 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-config\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.754482 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gpb\" (UniqueName: \"kubernetes.io/projected/d24483b5-1674-48c0-8330-1a96574d3460-kube-api-access-d9gpb\") pod \"dnsmasq-dns-57d769cc4f-dqggc\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:09 crc kubenswrapper[4835]: I1002 11:12:09.895048 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.150724 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4ggz9"] Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.400331 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.402445 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.404932 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.405363 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gnqgj" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.405623 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.405733 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.405796 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.407041 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.407369 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.410468 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.554964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75581788-2dfd-41d9-8500-0b4e3d050cab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555034 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555065 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-config-data\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555095 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555177 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxkn4\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-kube-api-access-qxkn4\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555305 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75581788-2dfd-41d9-8500-0b4e3d050cab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555323 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.555343 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656307 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656504 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxkn4\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-kube-api-access-qxkn4\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656542 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75581788-2dfd-41d9-8500-0b4e3d050cab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656567 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656588 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656635 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75581788-2dfd-41d9-8500-0b4e3d050cab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-config-data\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656714 
4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.656767 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.657433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.657429 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.657861 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.657897 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-config-data\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.657985 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.658620 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.662493 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.662527 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75581788-2dfd-41d9-8500-0b4e3d050cab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " 
pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.665881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75581788-2dfd-41d9-8500-0b4e3d050cab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.672811 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.673565 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxkn4\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-kube-api-access-qxkn4\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.679139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.707774 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.709519 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.712682 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.712961 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.713135 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.714788 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cm8f9" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.714837 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.715008 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.717913 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.725414 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.756152 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.858952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.858997 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859016 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859039 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859072 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859102 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtl6\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-kube-api-access-pdtl6\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859127 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859151 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859201 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859260 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.859565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961027 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961115 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961282 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961305 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961352 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961376 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 
11:12:10.961462 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961522 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtl6\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-kube-api-access-pdtl6\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961555 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.961578 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.962173 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.963281 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.964160 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.965050 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.965895 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.966078 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.967837 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.969372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.972712 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.980186 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.989602 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:10 crc kubenswrapper[4835]: I1002 11:12:10.999781 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtl6\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-kube-api-access-pdtl6\") pod \"rabbitmq-cell1-server-0\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:11 crc kubenswrapper[4835]: I1002 11:12:11.050005 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:12:11 crc kubenswrapper[4835]: I1002 11:12:11.984382 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:12:11 crc kubenswrapper[4835]: I1002 11:12:11.984506 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:12:12 crc kubenswrapper[4835]: W1002 11:12:12.270513 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ba43879_dd06_48b6_ba75_9520108a075e.slice/crio-58abfc881760711f71f0fa49536007d28ca5f08ad870f29de18c36234bb02da7 WatchSource:0}: Error finding container 58abfc881760711f71f0fa49536007d28ca5f08ad870f29de18c36234bb02da7: Status 404 returned error can't find the container with id 58abfc881760711f71f0fa49536007d28ca5f08ad870f29de18c36234bb02da7 Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.307619 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.641653 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" event={"ID":"5ba43879-dd06-48b6-ba75-9520108a075e","Type":"ContainerStarted","Data":"58abfc881760711f71f0fa49536007d28ca5f08ad870f29de18c36234bb02da7"} Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.923275 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.925733 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.928071 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.929148 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.929151 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.929492 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.932289 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.933691 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kjpbm" Oct 02 11:12:12 crc kubenswrapper[4835]: I1002 11:12:12.938503 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.003409 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.003536 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7fxp\" (UniqueName: \"kubernetes.io/projected/9ed37190-5b25-4c46-a9af-9d2b07322f98-kube-api-access-b7fxp\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.003562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.003582 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-kolla-config\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.003889 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-config-data-default\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.004014 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 
11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.004084 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-secrets\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.004114 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.004153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ed37190-5b25-4c46-a9af-9d2b07322f98-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.105167 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-secrets\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.105233 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.105854 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ed37190-5b25-4c46-a9af-9d2b07322f98-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.106846 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.105270 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ed37190-5b25-4c46-a9af-9d2b07322f98-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.106938 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.107410 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7fxp\" 
(UniqueName: \"kubernetes.io/projected/9ed37190-5b25-4c46-a9af-9d2b07322f98-kube-api-access-b7fxp\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.107443 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.107460 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-kolla-config\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.107516 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-config-data-default\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.108111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.108046 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-kolla-config\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.108662 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ed37190-5b25-4c46-a9af-9d2b07322f98-config-data-default\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.109190 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.114427 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.114857 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: 
I1002 11:12:13.128120 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9ed37190-5b25-4c46-a9af-9d2b07322f98-secrets\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.130007 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7fxp\" (UniqueName: \"kubernetes.io/projected/9ed37190-5b25-4c46-a9af-9d2b07322f98-kube-api-access-b7fxp\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.137214 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"9ed37190-5b25-4c46-a9af-9d2b07322f98\") " pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.261884 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.336265 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.337938 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.341671 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.341676 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.341796 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hz9v5" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.341851 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.356009 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429286 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1c80fb7-b373-407a-9024-6399def35365-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429396 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429447 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429466 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429483 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429517 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429577 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.429609 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxbj\" (UniqueName: \"kubernetes.io/projected/a1c80fb7-b373-407a-9024-6399def35365-kube-api-access-7sxbj\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.530837 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.530894 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1c80fb7-b373-407a-9024-6399def35365-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.530927 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.530960 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.530979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.531003 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.531037 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.531090 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.531120 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxbj\" (UniqueName: \"kubernetes.io/projected/a1c80fb7-b373-407a-9024-6399def35365-kube-api-access-7sxbj\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.532126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.532396 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1c80fb7-b373-407a-9024-6399def35365-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.533452 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 
11:12:13.536041 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.537498 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c80fb7-b373-407a-9024-6399def35365-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.538816 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.576333 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.577624 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxbj\" (UniqueName: \"kubernetes.io/projected/a1c80fb7-b373-407a-9024-6399def35365-kube-api-access-7sxbj\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.577745 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a1c80fb7-b373-407a-9024-6399def35365-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.578776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a1c80fb7-b373-407a-9024-6399def35365\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.659761 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.661551 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.664141 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.664198 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gpsbh" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.667236 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.667586 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.672848 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.734790 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9m49\" (UniqueName: \"kubernetes.io/projected/fda9ea37-267e-46e9-b3ad-721123c57703-kube-api-access-r9m49\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.734849 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9ea37-267e-46e9-b3ad-721123c57703-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.734868 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fda9ea37-267e-46e9-b3ad-721123c57703-config-data\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.734917 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fda9ea37-267e-46e9-b3ad-721123c57703-kolla-config\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.734944 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fda9ea37-267e-46e9-b3ad-721123c57703-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.836542 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fda9ea37-267e-46e9-b3ad-721123c57703-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.836638 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9m49\" (UniqueName: \"kubernetes.io/projected/fda9ea37-267e-46e9-b3ad-721123c57703-kube-api-access-r9m49\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.836664 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9ea37-267e-46e9-b3ad-721123c57703-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.836685 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fda9ea37-267e-46e9-b3ad-721123c57703-config-data\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 
crc kubenswrapper[4835]: I1002 11:12:13.836733 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fda9ea37-267e-46e9-b3ad-721123c57703-kolla-config\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.837510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fda9ea37-267e-46e9-b3ad-721123c57703-kolla-config\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.837703 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fda9ea37-267e-46e9-b3ad-721123c57703-config-data\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.841025 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fda9ea37-267e-46e9-b3ad-721123c57703-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.841431 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9ea37-267e-46e9-b3ad-721123c57703-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.857700 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9m49\" (UniqueName: \"kubernetes.io/projected/fda9ea37-267e-46e9-b3ad-721123c57703-kube-api-access-r9m49\") pod \"memcached-0\" (UID: \"fda9ea37-267e-46e9-b3ad-721123c57703\") " pod="openstack/memcached-0" Oct 02 11:12:13 crc kubenswrapper[4835]: I1002 11:12:13.992604 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.247581 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.248841 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.250713 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-fq7c7" Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.257648 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.360248 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxjzt\" (UniqueName: \"kubernetes.io/projected/3de79574-cfeb-4000-881a-94f1e4e22235-kube-api-access-qxjzt\") pod \"kube-state-metrics-0\" (UID: \"3de79574-cfeb-4000-881a-94f1e4e22235\") " pod="openstack/kube-state-metrics-0" Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.461378 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxjzt\" (UniqueName: \"kubernetes.io/projected/3de79574-cfeb-4000-881a-94f1e4e22235-kube-api-access-qxjzt\") pod \"kube-state-metrics-0\" (UID: \"3de79574-cfeb-4000-881a-94f1e4e22235\") " pod="openstack/kube-state-metrics-0" Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.483363 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxjzt\" (UniqueName: \"kubernetes.io/projected/3de79574-cfeb-4000-881a-94f1e4e22235-kube-api-access-qxjzt\") pod \"kube-state-metrics-0\" (UID: \"3de79574-cfeb-4000-881a-94f1e4e22235\") " pod="openstack/kube-state-metrics-0" Oct 02 11:12:15 crc kubenswrapper[4835]: I1002 11:12:15.569812 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:12:16 crc kubenswrapper[4835]: I1002 11:12:16.936796 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dqggc"] Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.627684 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2tk75"] Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.629391 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.636831 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2tk75"] Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.642356 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rsptw" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.642378 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.642790 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.649019 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4bgdg"] Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.651258 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.659519 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4bgdg"] Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.682786 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-log-ovn\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.682966 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rptd\" (UniqueName: \"kubernetes.io/projected/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-kube-api-access-9rptd\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.683254 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-scripts\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.683290 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-combined-ca-bundle\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.683317 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-run-ovn\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.683508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-run\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.683566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-ovn-controller-tls-certs\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785245 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-lib\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785331 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-run\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-log\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785380 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-ovn-controller-tls-certs\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785411 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d68db426-57b9-479b-92d4-e4661cbd2711-scripts\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785435 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-log-ovn\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785457 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-run\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785493 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rptd\" (UniqueName: \"kubernetes.io/projected/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-kube-api-access-9rptd\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785530 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-scripts\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785548 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-run-ovn\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785566 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-combined-ca-bundle\") pod \"ovn-controller-2tk75\" (UID: 
\"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785605 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-etc-ovs\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.785623 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9fp\" (UniqueName: \"kubernetes.io/projected/d68db426-57b9-479b-92d4-e4661cbd2711-kube-api-access-rd9fp\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.788017 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-scripts\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.790570 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-run\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.791119 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-log-ovn\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.791434 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-ovn-controller-tls-certs\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.791937 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-combined-ca-bundle\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.792154 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-var-run-ovn\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.808071 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rptd\" (UniqueName: \"kubernetes.io/projected/83798c14-4aa1-4530-82eb-fbe0cd6ceaf9-kube-api-access-9rptd\") pod \"ovn-controller-2tk75\" (UID: \"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9\") " pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.904527 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-etc-ovs\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.904585 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9fp\" (UniqueName: \"kubernetes.io/projected/d68db426-57b9-479b-92d4-e4661cbd2711-kube-api-access-rd9fp\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.904619 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-lib\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.904650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-log\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.904696 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d68db426-57b9-479b-92d4-e4661cbd2711-scripts\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.904738 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-run\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.905037 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-run\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.905246 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-lib\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.905253 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-etc-ovs\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.905327 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d68db426-57b9-479b-92d4-e4661cbd2711-var-log\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " 
pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.907201 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d68db426-57b9-479b-92d4-e4661cbd2711-scripts\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.926886 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9fp\" (UniqueName: \"kubernetes.io/projected/d68db426-57b9-479b-92d4-e4661cbd2711-kube-api-access-rd9fp\") pod \"ovn-controller-ovs-4bgdg\" (UID: \"d68db426-57b9-479b-92d4-e4661cbd2711\") " pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.959882 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2tk75" Oct 02 11:12:19 crc kubenswrapper[4835]: I1002 11:12:19.980786 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.524779 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.526431 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.531834 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.532237 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.533964 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.534199 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xj9sf" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.534328 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.552997 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.616526 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.616637 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.616734 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.616838 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.616877 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvvhq\" (UniqueName: \"kubernetes.io/projected/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-kube-api-access-lvvhq\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.616947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.617036 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.617087 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.718900 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719000 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719028 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719068 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719101 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-lvvhq\" (UniqueName: \"kubernetes.io/projected/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-kube-api-access-lvvhq\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719141 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719482 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.719782 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.720543 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.720552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.728732 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.735538 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc 
kubenswrapper[4835]: I1002 11:12:20.735921 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.738202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvvhq\" (UniqueName: \"kubernetes.io/projected/c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e-kube-api-access-lvvhq\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.739699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:20 crc kubenswrapper[4835]: I1002 11:12:20.857331 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:21 crc kubenswrapper[4835]: I1002 11:12:21.766718 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" event={"ID":"d24483b5-1674-48c0-8330-1a96574d3460","Type":"ContainerStarted","Data":"c142484196cb99c749a7db033f1ece07f0a3c07f3f84a45d584384cc15dd4280"} Oct 02 11:12:21 crc kubenswrapper[4835]: I1002 11:12:21.958880 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.405252 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.407464 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.410140 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.410597 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9vq4r" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.411082 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.412055 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.435390 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.464936 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48eefe46-521f-4c42-8796-ba131eae6a9e-config\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.465282 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48eefe46-521f-4c42-8796-ba131eae6a9e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.465481 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.465654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.465842 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spxf\" (UniqueName: \"kubernetes.io/projected/48eefe46-521f-4c42-8796-ba131eae6a9e-kube-api-access-4spxf\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.466066 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.466191 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48eefe46-521f-4c42-8796-ba131eae6a9e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.466454 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568594 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568653 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568691 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4spxf\" (UniqueName: \"kubernetes.io/projected/48eefe46-521f-4c42-8796-ba131eae6a9e-kube-api-access-4spxf\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568722 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568740 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48eefe46-521f-4c42-8796-ba131eae6a9e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568781 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568820 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48eefe46-521f-4c42-8796-ba131eae6a9e-config\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.568840 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48eefe46-521f-4c42-8796-ba131eae6a9e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.569538 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.569925 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48eefe46-521f-4c42-8796-ba131eae6a9e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.570078 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48eefe46-521f-4c42-8796-ba131eae6a9e-config\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.570499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48eefe46-521f-4c42-8796-ba131eae6a9e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.574574 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.574575 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.582057 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48eefe46-521f-4c42-8796-ba131eae6a9e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.586362 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spxf\" (UniqueName: \"kubernetes.io/projected/48eefe46-521f-4c42-8796-ba131eae6a9e-kube-api-access-4spxf\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.590305 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48eefe46-521f-4c42-8796-ba131eae6a9e\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: I1002 11:12:23.742560 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:23 crc kubenswrapper[4835]: E1002 11:12:23.888863 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:12:23 crc kubenswrapper[4835]: E1002 11:12:23.889062 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8zqx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fpcs7_openstack(c51cdb48-bdba-41e3-8e2f-17f7d95d93ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:12:23 crc kubenswrapper[4835]: E1002 11:12:23.890399 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:12:23 crc kubenswrapper[4835]: E1002 11:12:23.890494 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t9p54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-lqkzx_openstack(596e7ffa-16a7-4ac2-9663-ebce2e11b409): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:12:23 crc kubenswrapper[4835]: E1002 11:12:23.891161 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" podUID="c51cdb48-bdba-41e3-8e2f-17f7d95d93ad" Oct 02 11:12:23 crc kubenswrapper[4835]: E1002 11:12:23.891641 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" podUID="596e7ffa-16a7-4ac2-9663-ebce2e11b409" Oct 02 11:12:23 crc kubenswrapper[4835]: W1002 11:12:23.895906 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc536a4_ef50_4d6d_aca5_a030ce38ce24.slice/crio-e48e57d1da6c9e9c0484b04a239321bc2b9cfefdf19ce428af30c5d39dfd9d41 WatchSource:0}: Error finding container e48e57d1da6c9e9c0484b04a239321bc2b9cfefdf19ce428af30c5d39dfd9d41: Status 404 returned error can't find the container with id e48e57d1da6c9e9c0484b04a239321bc2b9cfefdf19ce428af30c5d39dfd9d41 Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.314424 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.460983 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4bgdg"] Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.563355 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.572014 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-2tk75"] Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.578035 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:12:24 crc kubenswrapper[4835]: W1002 11:12:24.585146 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ed37190_5b25_4c46_a9af_9d2b07322f98.slice/crio-8595d6dd99d00463b73a1e6e2a1125713e2639c694b149be5bd315ad04190177 WatchSource:0}: Error finding container 8595d6dd99d00463b73a1e6e2a1125713e2639c694b149be5bd315ad04190177: Status 404 returned error can't find the container with id 8595d6dd99d00463b73a1e6e2a1125713e2639c694b149be5bd315ad04190177 Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.585967 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:12:24 crc kubenswrapper[4835]: W1002 11:12:24.589208 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfda9ea37_267e_46e9_b3ad_721123c57703.slice/crio-6001b65027e350f8427339c4be00a7a925dce2d675bc3d19f53f11a03448eca8 WatchSource:0}: Error finding container 6001b65027e350f8427339c4be00a7a925dce2d675bc3d19f53f11a03448eca8: Status 404 returned error can't find the container with id 6001b65027e350f8427339c4be00a7a925dce2d675bc3d19f53f11a03448eca8 Oct 02 11:12:24 crc kubenswrapper[4835]: W1002 11:12:24.593296 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c80fb7_b373_407a_9024_6399def35365.slice/crio-52250916291fbc3e50099af9b2ac918b1f8e37238ee83ef423d0114626c192ec WatchSource:0}: Error finding container 52250916291fbc3e50099af9b2ac918b1f8e37238ee83ef423d0114626c192ec: Status 404 returned error can't find the container with id 52250916291fbc3e50099af9b2ac918b1f8e37238ee83ef423d0114626c192ec Oct 02 11:12:24 crc kubenswrapper[4835]: W1002 11:12:24.673625 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1be9dcf_ac2d_467b_9c8a_2ea18fb8bf0e.slice/crio-627f2baf96cb0f015c6bc53151f49f88cf2c9ea6208faf7958f926962a91715c WatchSource:0}: Error finding container 627f2baf96cb0f015c6bc53151f49f88cf2c9ea6208faf7958f926962a91715c: Status 404 returned error can't find the container with id 627f2baf96cb0f015c6bc53151f49f88cf2c9ea6208faf7958f926962a91715c Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.675249 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.757181 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:12:24 crc kubenswrapper[4835]: W1002 11:12:24.764751 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3de79574_cfeb_4000_881a_94f1e4e22235.slice/crio-24e334d32b113214d8521f84fdc0e912672b542505dfc4bc49f671fced98566c WatchSource:0}: Error finding container 24e334d32b113214d8521f84fdc0e912672b542505dfc4bc49f671fced98566c: Status 404 returned error can't find the container with id 24e334d32b113214d8521f84fdc0e912672b542505dfc4bc49f671fced98566c Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.788779 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e","Type":"ContainerStarted","Data":"627f2baf96cb0f015c6bc53151f49f88cf2c9ea6208faf7958f926962a91715c"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.790532 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4bgdg" event={"ID":"d68db426-57b9-479b-92d4-e4661cbd2711","Type":"ContainerStarted","Data":"2ab39ddcdbe35a8d9a6a9aa31d5e6b96ff1ba5c421d1eaa0bc2b0505790eb374"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.795211 4835 generic.go:334] "Generic (PLEG): container finished" podID="d24483b5-1674-48c0-8330-1a96574d3460" containerID="8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb" exitCode=0 Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.795247 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" event={"ID":"d24483b5-1674-48c0-8330-1a96574d3460","Type":"ContainerDied","Data":"8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.796915 4835 generic.go:334] "Generic (PLEG): container finished" podID="5ba43879-dd06-48b6-ba75-9520108a075e" containerID="d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008" exitCode=0 Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.796991 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" event={"ID":"5ba43879-dd06-48b6-ba75-9520108a075e","Type":"ContainerDied","Data":"d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.801309 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9ed37190-5b25-4c46-a9af-9d2b07322f98","Type":"ContainerStarted","Data":"8595d6dd99d00463b73a1e6e2a1125713e2639c694b149be5bd315ad04190177"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.803249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fda9ea37-267e-46e9-b3ad-721123c57703","Type":"ContainerStarted","Data":"6001b65027e350f8427339c4be00a7a925dce2d675bc3d19f53f11a03448eca8"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.804309 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75581788-2dfd-41d9-8500-0b4e3d050cab","Type":"ContainerStarted","Data":"d142e42e7bed7a3468134867521820f8354d04599870b2c2298e6578d13356e7"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.805812 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6bc536a4-ef50-4d6d-aca5-a030ce38ce24","Type":"ContainerStarted","Data":"e48e57d1da6c9e9c0484b04a239321bc2b9cfefdf19ce428af30c5d39dfd9d41"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.808489 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3de79574-cfeb-4000-881a-94f1e4e22235","Type":"ContainerStarted","Data":"24e334d32b113214d8521f84fdc0e912672b542505dfc4bc49f671fced98566c"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.810165 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1c80fb7-b373-407a-9024-6399def35365","Type":"ContainerStarted","Data":"52250916291fbc3e50099af9b2ac918b1f8e37238ee83ef423d0114626c192ec"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.821344 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-2tk75" event={"ID":"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9","Type":"ContainerStarted","Data":"e8535eefbe58788bf48e49029ec96890c5b15f6eabace239b91d0ff24fb9fae4"} Oct 02 11:12:24 crc kubenswrapper[4835]: I1002 11:12:24.934065 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:12:24 crc kubenswrapper[4835]: W1002 11:12:24.940991 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48eefe46_521f_4c42_8796_ba131eae6a9e.slice/crio-09bcd8f2c30409e7dd5daa4117ffd7d5db070a80fb52bdd4e874644c88994920 WatchSource:0}: Error finding container 09bcd8f2c30409e7dd5daa4117ffd7d5db070a80fb52bdd4e874644c88994920: Status 404 returned error can't find the container with id 09bcd8f2c30409e7dd5daa4117ffd7d5db070a80fb52bdd4e874644c88994920 Oct 02 11:12:25 crc kubenswrapper[4835]: E1002 11:12:25.059987 4835 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 02 11:12:25 crc kubenswrapper[4835]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/5ba43879-dd06-48b6-ba75-9520108a075e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 11:12:25 crc kubenswrapper[4835]: > podSandboxID="58abfc881760711f71f0fa49536007d28ca5f08ad870f29de18c36234bb02da7" Oct 02 11:12:25 crc kubenswrapper[4835]: E1002 11:12:25.061246 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 02 11:12:25 crc kubenswrapper[4835]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rrj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-4ggz9_openstack(5ba43879-dd06-48b6-ba75-9520108a075e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/5ba43879-dd06-48b6-ba75-9520108a075e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 11:12:25 crc kubenswrapper[4835]: > logger="UnhandledError" Oct 02 11:12:25 crc kubenswrapper[4835]: E1002 11:12:25.062534 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/5ba43879-dd06-48b6-ba75-9520108a075e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" podUID="5ba43879-dd06-48b6-ba75-9520108a075e" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.163731 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.208490 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-dns-svc\") pod \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.208597 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zqx4\" (UniqueName: \"kubernetes.io/projected/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-kube-api-access-8zqx4\") pod \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.208629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-config\") pod \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\" (UID: \"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad\") " Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.209380 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-config" (OuterVolumeSpecName: "config") pod "c51cdb48-bdba-41e3-8e2f-17f7d95d93ad" (UID: "c51cdb48-bdba-41e3-8e2f-17f7d95d93ad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.209798 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c51cdb48-bdba-41e3-8e2f-17f7d95d93ad" (UID: "c51cdb48-bdba-41e3-8e2f-17f7d95d93ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.213093 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-kube-api-access-8zqx4" (OuterVolumeSpecName: "kube-api-access-8zqx4") pod "c51cdb48-bdba-41e3-8e2f-17f7d95d93ad" (UID: "c51cdb48-bdba-41e3-8e2f-17f7d95d93ad"). InnerVolumeSpecName "kube-api-access-8zqx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.247360 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.310275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596e7ffa-16a7-4ac2-9663-ebce2e11b409-config\") pod \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.310478 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9p54\" (UniqueName: \"kubernetes.io/projected/596e7ffa-16a7-4ac2-9663-ebce2e11b409-kube-api-access-t9p54\") pod \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\" (UID: \"596e7ffa-16a7-4ac2-9663-ebce2e11b409\") " Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.310888 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zqx4\" (UniqueName: \"kubernetes.io/projected/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-kube-api-access-8zqx4\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.310905 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.310914 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.311099 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596e7ffa-16a7-4ac2-9663-ebce2e11b409-config" (OuterVolumeSpecName: "config") pod "596e7ffa-16a7-4ac2-9663-ebce2e11b409" (UID: "596e7ffa-16a7-4ac2-9663-ebce2e11b409"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.313484 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596e7ffa-16a7-4ac2-9663-ebce2e11b409-kube-api-access-t9p54" (OuterVolumeSpecName: "kube-api-access-t9p54") pod "596e7ffa-16a7-4ac2-9663-ebce2e11b409" (UID: "596e7ffa-16a7-4ac2-9663-ebce2e11b409"). InnerVolumeSpecName "kube-api-access-t9p54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.412765 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9p54\" (UniqueName: \"kubernetes.io/projected/596e7ffa-16a7-4ac2-9663-ebce2e11b409-kube-api-access-t9p54\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.412803 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596e7ffa-16a7-4ac2-9663-ebce2e11b409-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.847293 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" event={"ID":"d24483b5-1674-48c0-8330-1a96574d3460","Type":"ContainerStarted","Data":"f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe"} Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.847635 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.850051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"48eefe46-521f-4c42-8796-ba131eae6a9e","Type":"ContainerStarted","Data":"09bcd8f2c30409e7dd5daa4117ffd7d5db070a80fb52bdd4e874644c88994920"} Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.853417 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" event={"ID":"596e7ffa-16a7-4ac2-9663-ebce2e11b409","Type":"ContainerDied","Data":"0593d59ea8064b2ad457f3e21dff88861bb7a69146e0abd5ca340bf21179d87e"} Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.853580 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lqkzx" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.855798 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.856180 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fpcs7" event={"ID":"c51cdb48-bdba-41e3-8e2f-17f7d95d93ad","Type":"ContainerDied","Data":"f0e83eb8cbaf13a33f679591c72dffb7b3e9bec6fb3032d77652680d06ae093f"} Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.866016 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" podStartSLOduration=14.391524316 podStartE2EDuration="16.865998324s" podCreationTimestamp="2025-10-02 11:12:09 +0000 UTC" firstStartedPulling="2025-10-02 11:12:21.577635072 +0000 UTC m=+1018.137542653" lastFinishedPulling="2025-10-02 11:12:24.05210908 +0000 UTC m=+1020.612016661" observedRunningTime="2025-10-02 11:12:25.865594462 +0000 UTC m=+1022.425502043" watchObservedRunningTime="2025-10-02 11:12:25.865998324 +0000 UTC m=+1022.425905905" Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.925336 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fpcs7"] Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.935344 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fpcs7"] Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.959304 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lqkzx"] Oct 02 11:12:25 crc kubenswrapper[4835]: I1002 11:12:25.966460 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lqkzx"] Oct 02 11:12:26 crc kubenswrapper[4835]: I1002 11:12:26.268798 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596e7ffa-16a7-4ac2-9663-ebce2e11b409" path="/var/lib/kubelet/pods/596e7ffa-16a7-4ac2-9663-ebce2e11b409/volumes" Oct 02 11:12:26 crc kubenswrapper[4835]: I1002 11:12:26.269277 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c51cdb48-bdba-41e3-8e2f-17f7d95d93ad" path="/var/lib/kubelet/pods/c51cdb48-bdba-41e3-8e2f-17f7d95d93ad/volumes" Oct 02 11:12:34 crc kubenswrapper[4835]: I1002 11:12:34.898647 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:34 crc kubenswrapper[4835]: I1002 11:12:34.974813 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4ggz9"] Oct 02 11:12:36 crc kubenswrapper[4835]: E1002 11:12:36.835125 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Oct 02 11:12:36 crc kubenswrapper[4835]: E1002 11:12:36.835579 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68fh5b8hfh5bh658h55ch597h667h58fh575hdbh68ch57dh6ch79h65dh686h674hdfh689h55fh9bh664h9hc7h68bhb5h56bh6fh5fh54fhb8q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvvhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-nb-0_openstack(c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:12:37 crc kubenswrapper[4835]: E1002 11:12:37.550267 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 02 11:12:37 crc kubenswrapper[4835]: E1002 11:12:37.550741 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 02 11:12:37 crc kubenswrapper[4835]: E1002 11:12:37.550980 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxjzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(3de79574-cfeb-4000-881a-94f1e4e22235): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:12:37 crc kubenswrapper[4835]: E1002 11:12:37.552419 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="3de79574-cfeb-4000-881a-94f1e4e22235" Oct 02 11:12:37 crc kubenswrapper[4835]: I1002 11:12:37.993609 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1c80fb7-b373-407a-9024-6399def35365","Type":"ContainerStarted","Data":"f2f8b69198502844e0354a023f76227527bbef86b57e88c8da7c6c67767ec7c9"} Oct 02 11:12:37 crc kubenswrapper[4835]: I1002 11:12:37.996370 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" podUID="5ba43879-dd06-48b6-ba75-9520108a075e" containerName="dnsmasq-dns" containerID="cri-o://24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd" gracePeriod=10 Oct 02 11:12:37 crc kubenswrapper[4835]: I1002 11:12:37.997714 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" event={"ID":"5ba43879-dd06-48b6-ba75-9520108a075e","Type":"ContainerStarted","Data":"24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd"} Oct 02 11:12:37 crc kubenswrapper[4835]: I1002 11:12:37.997765 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9ed37190-5b25-4c46-a9af-9d2b07322f98","Type":"ContainerStarted","Data":"14df3672f6cfcc491b8b8b320acf3c0231c7b956fe7b59052419a0f1c057949c"} Oct 02 11:12:37 crc kubenswrapper[4835]: I1002 11:12:37.997817 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.000540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"48eefe46-521f-4c42-8796-ba131eae6a9e","Type":"ContainerStarted","Data":"9b19932562f4d8ccb4df00bbcdeb658a25cabd45b5c06c2a24699fbc35af8fc8"} Oct 02 11:12:38 crc kubenswrapper[4835]: E1002 11:12:38.001895 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="3de79574-cfeb-4000-881a-94f1e4e22235" Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.053406 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" podStartSLOduration=17.362101713 podStartE2EDuration="29.053379692s" podCreationTimestamp="2025-10-02 11:12:09 +0000 UTC" firstStartedPulling="2025-10-02 11:12:12.307284849 +0000 UTC m=+1008.867192430" lastFinishedPulling="2025-10-02 11:12:23.998562828 +0000 UTC m=+1020.558470409" observedRunningTime="2025-10-02 11:12:38.03969965 +0000 UTC m=+1034.599607251" watchObservedRunningTime="2025-10-02 11:12:38.053379692 +0000 UTC m=+1034.613287273" Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.470848 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.577981 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-config\") pod \"5ba43879-dd06-48b6-ba75-9520108a075e\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.578061 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-dns-svc\") pod \"5ba43879-dd06-48b6-ba75-9520108a075e\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.578111 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rrj5\" (UniqueName: \"kubernetes.io/projected/5ba43879-dd06-48b6-ba75-9520108a075e-kube-api-access-4rrj5\") pod \"5ba43879-dd06-48b6-ba75-9520108a075e\" (UID: \"5ba43879-dd06-48b6-ba75-9520108a075e\") " Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.584048 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba43879-dd06-48b6-ba75-9520108a075e-kube-api-access-4rrj5" (OuterVolumeSpecName: "kube-api-access-4rrj5") pod "5ba43879-dd06-48b6-ba75-9520108a075e" (UID: "5ba43879-dd06-48b6-ba75-9520108a075e"). InnerVolumeSpecName "kube-api-access-4rrj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.680142 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rrj5\" (UniqueName: \"kubernetes.io/projected/5ba43879-dd06-48b6-ba75-9520108a075e-kube-api-access-4rrj5\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.979673 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ba43879-dd06-48b6-ba75-9520108a075e" (UID: "5ba43879-dd06-48b6-ba75-9520108a075e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:38 crc kubenswrapper[4835]: I1002 11:12:38.986095 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.009714 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-config" (OuterVolumeSpecName: "config") pod "5ba43879-dd06-48b6-ba75-9520108a075e" (UID: "5ba43879-dd06-48b6-ba75-9520108a075e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.018048 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75" event={"ID":"83798c14-4aa1-4530-82eb-fbe0cd6ceaf9","Type":"ContainerStarted","Data":"b5719e5d8542b4e49e31f0ff6111ec4f5f16c0cc467a323c5e26d5f437655982"} Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.019405 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2tk75" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.022714 4835 generic.go:334] "Generic (PLEG): container finished" podID="5ba43879-dd06-48b6-ba75-9520108a075e" containerID="24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd" exitCode=0 Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.022780 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" event={"ID":"5ba43879-dd06-48b6-ba75-9520108a075e","Type":"ContainerDied","Data":"24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd"} Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.022803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" event={"ID":"5ba43879-dd06-48b6-ba75-9520108a075e","Type":"ContainerDied","Data":"58abfc881760711f71f0fa49536007d28ca5f08ad870f29de18c36234bb02da7"} Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.022804 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4ggz9" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.022820 4835 scope.go:117] "RemoveContainer" containerID="24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.027311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fda9ea37-267e-46e9-b3ad-721123c57703","Type":"ContainerStarted","Data":"29f2d3c7a9a61a42bd4be2405b010a5fab31fbf07cdfd7f427eadf5899e31d65"} Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.027494 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.029833 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4bgdg" event={"ID":"d68db426-57b9-479b-92d4-e4661cbd2711","Type":"ContainerStarted","Data":"444ee4835511af67bd0abcd6e62afa94d1afe570704ef1a5c58c4d6c9328f518"} Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.048808 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2tk75" podStartSLOduration=7.5309271330000005 podStartE2EDuration="20.048762719s" podCreationTimestamp="2025-10-02 11:12:19 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.593706361 +0000 UTC m=+1021.153613942" lastFinishedPulling="2025-10-02 11:12:37.111541947 +0000 UTC m=+1033.671449528" observedRunningTime="2025-10-02 11:12:39.036345424 +0000 UTC m=+1035.596253025" watchObservedRunningTime="2025-10-02 11:12:39.048762719 +0000 UTC m=+1035.608670290" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.068583 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.169113328 podStartE2EDuration="26.068557316s" podCreationTimestamp="2025-10-02 11:12:13 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.5954214 +0000 UTC m=+1021.155328981" 
lastFinishedPulling="2025-10-02 11:12:36.494865368 +0000 UTC m=+1033.054772969" observedRunningTime="2025-10-02 11:12:39.064695555 +0000 UTC m=+1035.624603136" watchObservedRunningTime="2025-10-02 11:12:39.068557316 +0000 UTC m=+1035.628464897" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.087992 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba43879-dd06-48b6-ba75-9520108a075e-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.114119 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4ggz9"] Oct 02 11:12:39 crc kubenswrapper[4835]: I1002 11:12:39.121597 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4ggz9"] Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.040951 4835 generic.go:334] "Generic (PLEG): container finished" podID="d68db426-57b9-479b-92d4-e4661cbd2711" containerID="444ee4835511af67bd0abcd6e62afa94d1afe570704ef1a5c58c4d6c9328f518" exitCode=0 Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.041051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4bgdg" event={"ID":"d68db426-57b9-479b-92d4-e4661cbd2711","Type":"ContainerDied","Data":"444ee4835511af67bd0abcd6e62afa94d1afe570704ef1a5c58c4d6c9328f518"} Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.045047 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75581788-2dfd-41d9-8500-0b4e3d050cab","Type":"ContainerStarted","Data":"37458b76b2ffa63de5c4a5fa2995e72d004e9c5d9c332ebf588cdfc9661d408e"} Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.048142 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6bc536a4-ef50-4d6d-aca5-a030ce38ce24","Type":"ContainerStarted","Data":"0ef1712aa04a0792cf4b9650430c005225c98aa32da76b73bd349ff12c229cbd"} Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.234686 4835 scope.go:117] "RemoveContainer" containerID="d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008" Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.262710 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba43879-dd06-48b6-ba75-9520108a075e" path="/var/lib/kubelet/pods/5ba43879-dd06-48b6-ba75-9520108a075e/volumes" Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.601396 4835 scope.go:117] "RemoveContainer" containerID="24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd" Oct 02 11:12:40 crc kubenswrapper[4835]: E1002 11:12:40.602556 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd\": container with ID starting with 24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd not found: ID does not exist" containerID="24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd" Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.602628 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd"} err="failed to get container status \"24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd\": rpc error: code = NotFound desc = could not find container \"24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd\": container 
with ID starting with 24c14fdafbe7d49ae1f93ddd0fecdf4d5654b8868923326b4bda55b0e6e0e2dd not found: ID does not exist" Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.602660 4835 scope.go:117] "RemoveContainer" containerID="d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008" Oct 02 11:12:40 crc kubenswrapper[4835]: E1002 11:12:40.603166 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008\": container with ID starting with d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008 not found: ID does not exist" containerID="d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008" Oct 02 11:12:40 crc kubenswrapper[4835]: I1002 11:12:40.603212 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008"} err="failed to get container status \"d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008\": rpc error: code = NotFound desc = could not find container \"d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008\": container with ID starting with d9d301340f4f6bed9c4eb34880a4cc646c792ffc2e5f4635feeeb3c9e692e008 not found: ID does not exist" Oct 02 11:12:40 crc kubenswrapper[4835]: E1002 11:12:40.870540 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e" Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.061123 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4bgdg" event={"ID":"d68db426-57b9-479b-92d4-e4661cbd2711","Type":"ContainerStarted","Data":"30657c39bb1e80d797ad2947e2415a0783bc72fac0e8718d7bb53e75c5750c0c"} Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.069310 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"48eefe46-521f-4c42-8796-ba131eae6a9e","Type":"ContainerStarted","Data":"89c8093eda75767fc90e43bbf4ef12984d6185687ed62a97875d1f8cb3f71558"} Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.073566 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e","Type":"ContainerStarted","Data":"c19f8c0ce7fdcb524ccb6697025b656f8c922f02a6976223d84f1a35fd9d7f8f"} Oct 02 11:12:41 crc kubenswrapper[4835]: E1002 11:12:41.075492 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e" Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.094665 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.339095753 podStartE2EDuration="19.094641882s" podCreationTimestamp="2025-10-02 11:12:22 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.946071115 +0000 UTC m=+1021.505978706" lastFinishedPulling="2025-10-02 11:12:40.701617244 +0000 UTC m=+1037.261524835" observedRunningTime="2025-10-02 11:12:41.091125191 +0000 UTC m=+1037.651032782" 
watchObservedRunningTime="2025-10-02 11:12:41.094641882 +0000 UTC m=+1037.654549463" Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.743470 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.818039 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.983743 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:12:41 crc kubenswrapper[4835]: I1002 11:12:41.983834 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.086844 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4bgdg" event={"ID":"d68db426-57b9-479b-92d4-e4661cbd2711","Type":"ContainerStarted","Data":"00887a27373ecf3e7264605bae8a0c7c3cc48c5b52562113c30330f66be2e526"} Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.087979 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:42 crc kubenswrapper[4835]: E1002 11:12:42.089399 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.135624 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.139387 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4bgdg" podStartSLOduration=10.514592504 podStartE2EDuration="23.139366131s" podCreationTimestamp="2025-10-02 11:12:19 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.485781852 +0000 UTC m=+1021.045689433" lastFinishedPulling="2025-10-02 11:12:37.110555479 +0000 UTC m=+1033.670463060" observedRunningTime="2025-10-02 11:12:42.132507765 +0000 UTC m=+1038.692415346" watchObservedRunningTime="2025-10-02 11:12:42.139366131 +0000 UTC m=+1038.699273702" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.435802 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hcnw2"] Oct 02 11:12:42 crc kubenswrapper[4835]: E1002 11:12:42.437085 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba43879-dd06-48b6-ba75-9520108a075e" containerName="dnsmasq-dns" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.437103 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba43879-dd06-48b6-ba75-9520108a075e" containerName="dnsmasq-dns" Oct 02 11:12:42 crc kubenswrapper[4835]: E1002 11:12:42.437132 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5ba43879-dd06-48b6-ba75-9520108a075e" containerName="init" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.437342 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba43879-dd06-48b6-ba75-9520108a075e" containerName="init" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.437967 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba43879-dd06-48b6-ba75-9520108a075e" containerName="dnsmasq-dns" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.440133 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.447131 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.451067 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hcnw2"] Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.487511 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fhwnp"] Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.489266 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.492804 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.543131 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fhwnp"] Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551452 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5831d90-f0d5-47a7-9ff0-532a176b71d2-config\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-config\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6djt8\" (UniqueName: \"kubernetes.io/projected/b5831d90-f0d5-47a7-9ff0-532a176b71d2-kube-api-access-6djt8\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551589 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b5831d90-f0d5-47a7-9ff0-532a176b71d2-ovs-rundir\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551637 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b5831d90-f0d5-47a7-9ff0-532a176b71d2-ovn-rundir\") pod \"ovn-controller-metrics-fhwnp\" 
(UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551791 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5831d90-f0d5-47a7-9ff0-532a176b71d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551915 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5831d90-f0d5-47a7-9ff0-532a176b71d2-combined-ca-bundle\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551955 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j696j\" (UniqueName: \"kubernetes.io/projected/0859fb19-de1d-40d4-ab38-7e52e98d2391-kube-api-access-j696j\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.551985 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.552051 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.653522 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j696j\" (UniqueName: \"kubernetes.io/projected/0859fb19-de1d-40d4-ab38-7e52e98d2391-kube-api-access-j696j\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.654030 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.655070 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.655136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.655751 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.655827 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5831d90-f0d5-47a7-9ff0-532a176b71d2-config\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656412 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5831d90-f0d5-47a7-9ff0-532a176b71d2-config\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656477 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-config\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656562 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6djt8\" (UniqueName: \"kubernetes.io/projected/b5831d90-f0d5-47a7-9ff0-532a176b71d2-kube-api-access-6djt8\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656613 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b5831d90-f0d5-47a7-9ff0-532a176b71d2-ovs-rundir\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656851 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b5831d90-f0d5-47a7-9ff0-532a176b71d2-ovs-rundir\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656855 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-config\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656927 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b5831d90-f0d5-47a7-9ff0-532a176b71d2-ovn-rundir\") pod \"ovn-controller-metrics-fhwnp\" (UID: 
\"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.656968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5831d90-f0d5-47a7-9ff0-532a176b71d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.657035 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5831d90-f0d5-47a7-9ff0-532a176b71d2-combined-ca-bundle\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.657070 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b5831d90-f0d5-47a7-9ff0-532a176b71d2-ovn-rundir\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.662696 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5831d90-f0d5-47a7-9ff0-532a176b71d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.662962 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5831d90-f0d5-47a7-9ff0-532a176b71d2-combined-ca-bundle\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.678101 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j696j\" (UniqueName: \"kubernetes.io/projected/0859fb19-de1d-40d4-ab38-7e52e98d2391-kube-api-access-j696j\") pod \"dnsmasq-dns-7f896c8c65-hcnw2\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.678155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6djt8\" (UniqueName: \"kubernetes.io/projected/b5831d90-f0d5-47a7-9ff0-532a176b71d2-kube-api-access-6djt8\") pod \"ovn-controller-metrics-fhwnp\" (UID: \"b5831d90-f0d5-47a7-9ff0-532a176b71d2\") " pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.780096 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.815938 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-fhwnp" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.907415 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hcnw2"] Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.965856 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-cjkjv"] Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.967348 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.969550 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 11:12:42 crc kubenswrapper[4835]: I1002 11:12:42.996197 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-cjkjv"] Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.063700 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz5pn\" (UniqueName: \"kubernetes.io/projected/16ab9964-fa74-4216-a34d-37c0fc813a16-kube-api-access-vz5pn\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.064143 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.064170 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.064539 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.064652 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-config\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.098327 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1c80fb7-b373-407a-9024-6399def35365" containerID="f2f8b69198502844e0354a023f76227527bbef86b57e88c8da7c6c67767ec7c9" exitCode=0 Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.098449 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1c80fb7-b373-407a-9024-6399def35365","Type":"ContainerDied","Data":"f2f8b69198502844e0354a023f76227527bbef86b57e88c8da7c6c67767ec7c9"} Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 
11:12:43.105494 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ed37190-5b25-4c46-a9af-9d2b07322f98" containerID="14df3672f6cfcc491b8b8b320acf3c0231c7b956fe7b59052419a0f1c057949c" exitCode=0 Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.105959 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9ed37190-5b25-4c46-a9af-9d2b07322f98","Type":"ContainerDied","Data":"14df3672f6cfcc491b8b8b320acf3c0231c7b956fe7b59052419a0f1c057949c"} Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.106205 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.106335 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.167426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.167487 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-config\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.167602 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz5pn\" (UniqueName: \"kubernetes.io/projected/16ab9964-fa74-4216-a34d-37c0fc813a16-kube-api-access-vz5pn\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.167624 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.167639 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.168765 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.171282 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc 
kubenswrapper[4835]: I1002 11:12:43.174329 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.184392 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-config\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.201958 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz5pn\" (UniqueName: \"kubernetes.io/projected/16ab9964-fa74-4216-a34d-37c0fc813a16-kube-api-access-vz5pn\") pod \"dnsmasq-dns-86db49b7ff-cjkjv\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.310579 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.488543 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hcnw2"] Oct 02 11:12:43 crc kubenswrapper[4835]: W1002 11:12:43.494639 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0859fb19_de1d_40d4_ab38_7e52e98d2391.slice/crio-5964c19721e0ce0721ac1346d9fffbeb2c56fdc998dfe5642c6d513a53fb0305 WatchSource:0}: Error finding container 5964c19721e0ce0721ac1346d9fffbeb2c56fdc998dfe5642c6d513a53fb0305: Status 404 returned error can't find the container with id 5964c19721e0ce0721ac1346d9fffbeb2c56fdc998dfe5642c6d513a53fb0305 Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.593135 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fhwnp"] Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.769943 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-cjkjv"] Oct 02 11:12:43 crc kubenswrapper[4835]: W1002 11:12:43.845960 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16ab9964_fa74_4216_a34d_37c0fc813a16.slice/crio-f36c3b7e302c054739fa86c57a0b1befc56c56e0e83669eeee58462a94669e1f WatchSource:0}: Error finding container f36c3b7e302c054739fa86c57a0b1befc56c56e0e83669eeee58462a94669e1f: Status 404 returned error can't find the container with id f36c3b7e302c054739fa86c57a0b1befc56c56e0e83669eeee58462a94669e1f Oct 02 11:12:43 crc kubenswrapper[4835]: I1002 11:12:43.995888 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.115855 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fhwnp" event={"ID":"b5831d90-f0d5-47a7-9ff0-532a176b71d2","Type":"ContainerStarted","Data":"48013e5ca1c232b4ac78017019e8537829251909e01e27c03db5975e2350b324"} Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.118360 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"a1c80fb7-b373-407a-9024-6399def35365","Type":"ContainerStarted","Data":"5bcb9e37384976d6fb37aeaba55bf9516d8137a683fe75bb992717ad375b256d"} Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.120086 4835 generic.go:334] "Generic (PLEG): container finished" podID="0859fb19-de1d-40d4-ab38-7e52e98d2391" containerID="04a868deafc2f78b52b06d6f9ff45a0bf8602f05a2674e51deb06f982c94e194" exitCode=0 Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.120162 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" event={"ID":"0859fb19-de1d-40d4-ab38-7e52e98d2391","Type":"ContainerDied","Data":"04a868deafc2f78b52b06d6f9ff45a0bf8602f05a2674e51deb06f982c94e194"} Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.120198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" event={"ID":"0859fb19-de1d-40d4-ab38-7e52e98d2391","Type":"ContainerStarted","Data":"5964c19721e0ce0721ac1346d9fffbeb2c56fdc998dfe5642c6d513a53fb0305"} Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.125332 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9ed37190-5b25-4c46-a9af-9d2b07322f98","Type":"ContainerStarted","Data":"2ef5cbfabde83a7be7c8708a8508c775fe382ab179a99fd8a30ffc7e0439e473"} Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.129290 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" event={"ID":"16ab9964-fa74-4216-a34d-37c0fc813a16","Type":"ContainerStarted","Data":"218ce625bcfca3e1dd61767139013b197951c21ba14f39c67e1cb361576de52f"} Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.129357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" event={"ID":"16ab9964-fa74-4216-a34d-37c0fc813a16","Type":"ContainerStarted","Data":"f36c3b7e302c054739fa86c57a0b1befc56c56e0e83669eeee58462a94669e1f"} Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.154642 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.26150026 podStartE2EDuration="32.154613927s" podCreationTimestamp="2025-10-02 11:12:12 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.597240322 +0000 UTC m=+1021.157147903" lastFinishedPulling="2025-10-02 11:12:37.490353999 +0000 UTC m=+1034.050261570" observedRunningTime="2025-10-02 11:12:44.143378166 +0000 UTC m=+1040.703285747" watchObservedRunningTime="2025-10-02 11:12:44.154613927 +0000 UTC m=+1040.714521508" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.292536 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.94947224 podStartE2EDuration="33.292514094s" podCreationTimestamp="2025-10-02 11:12:11 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.587921405 +0000 UTC m=+1021.147828976" lastFinishedPulling="2025-10-02 11:12:36.930963249 +0000 UTC m=+1033.490870830" observedRunningTime="2025-10-02 11:12:44.219819473 +0000 UTC m=+1040.779727054" watchObservedRunningTime="2025-10-02 11:12:44.292514094 +0000 UTC m=+1040.852421675" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.416611 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.539400 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j696j\" (UniqueName: \"kubernetes.io/projected/0859fb19-de1d-40d4-ab38-7e52e98d2391-kube-api-access-j696j\") pod \"0859fb19-de1d-40d4-ab38-7e52e98d2391\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.539656 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-ovsdbserver-sb\") pod \"0859fb19-de1d-40d4-ab38-7e52e98d2391\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.539689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-config\") pod \"0859fb19-de1d-40d4-ab38-7e52e98d2391\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.539720 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-dns-svc\") pod \"0859fb19-de1d-40d4-ab38-7e52e98d2391\" (UID: \"0859fb19-de1d-40d4-ab38-7e52e98d2391\") " Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.546650 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0859fb19-de1d-40d4-ab38-7e52e98d2391-kube-api-access-j696j" (OuterVolumeSpecName: "kube-api-access-j696j") pod "0859fb19-de1d-40d4-ab38-7e52e98d2391" (UID: "0859fb19-de1d-40d4-ab38-7e52e98d2391"). InnerVolumeSpecName "kube-api-access-j696j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.560879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-config" (OuterVolumeSpecName: "config") pod "0859fb19-de1d-40d4-ab38-7e52e98d2391" (UID: "0859fb19-de1d-40d4-ab38-7e52e98d2391"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.560945 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0859fb19-de1d-40d4-ab38-7e52e98d2391" (UID: "0859fb19-de1d-40d4-ab38-7e52e98d2391"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.563074 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0859fb19-de1d-40d4-ab38-7e52e98d2391" (UID: "0859fb19-de1d-40d4-ab38-7e52e98d2391"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.642173 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.642215 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.642261 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0859fb19-de1d-40d4-ab38-7e52e98d2391-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:44 crc kubenswrapper[4835]: I1002 11:12:44.642272 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j696j\" (UniqueName: \"kubernetes.io/projected/0859fb19-de1d-40d4-ab38-7e52e98d2391-kube-api-access-j696j\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.137957 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fhwnp" event={"ID":"b5831d90-f0d5-47a7-9ff0-532a176b71d2","Type":"ContainerStarted","Data":"973c403b81712c13749ac7043261b66d7bdb38da3947d99d6fad802a2fbf2616"} Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.140862 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" event={"ID":"0859fb19-de1d-40d4-ab38-7e52e98d2391","Type":"ContainerDied","Data":"5964c19721e0ce0721ac1346d9fffbeb2c56fdc998dfe5642c6d513a53fb0305"} Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.140885 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hcnw2" Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.140922 4835 scope.go:117] "RemoveContainer" containerID="04a868deafc2f78b52b06d6f9ff45a0bf8602f05a2674e51deb06f982c94e194" Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.143251 4835 generic.go:334] "Generic (PLEG): container finished" podID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerID="218ce625bcfca3e1dd61767139013b197951c21ba14f39c67e1cb361576de52f" exitCode=0 Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.143315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" event={"ID":"16ab9964-fa74-4216-a34d-37c0fc813a16","Type":"ContainerDied","Data":"218ce625bcfca3e1dd61767139013b197951c21ba14f39c67e1cb361576de52f"} Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.221023 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fhwnp" podStartSLOduration=3.221004036 podStartE2EDuration="3.221004036s" podCreationTimestamp="2025-10-02 11:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:45.171315644 +0000 UTC m=+1041.731223225" watchObservedRunningTime="2025-10-02 11:12:45.221004036 +0000 UTC m=+1041.780911617" Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.280894 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hcnw2"] Oct 02 11:12:45 crc kubenswrapper[4835]: I1002 11:12:45.295330 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hcnw2"] Oct 02 11:12:46 crc kubenswrapper[4835]: I1002 11:12:46.156588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" event={"ID":"16ab9964-fa74-4216-a34d-37c0fc813a16","Type":"ContainerStarted","Data":"e9ba4d66278959154e7f401fd433752b90913cc2898571ab1ad680089f52f1cb"} Oct 02 11:12:46 crc kubenswrapper[4835]: I1002 11:12:46.156989 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:46 crc kubenswrapper[4835]: I1002 11:12:46.265074 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0859fb19-de1d-40d4-ab38-7e52e98d2391" path="/var/lib/kubelet/pods/0859fb19-de1d-40d4-ab38-7e52e98d2391/volumes" Oct 02 11:12:49 crc kubenswrapper[4835]: I1002 11:12:49.274466 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" podStartSLOduration=7.274447974 podStartE2EDuration="7.274447974s" podCreationTimestamp="2025-10-02 11:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:46.180878488 +0000 UTC m=+1042.740786129" watchObservedRunningTime="2025-10-02 11:12:49.274447974 +0000 UTC m=+1045.834355555" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.226077 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3de79574-cfeb-4000-881a-94f1e4e22235","Type":"ContainerStarted","Data":"1a1bdf6709c45dfa9e7349df3425d989560ad4c89e21fb01b0fa0de223be33a8"} Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.226754 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 
11:12:53.250590 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.759601179 podStartE2EDuration="38.250569789s" podCreationTimestamp="2025-10-02 11:12:15 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.768057441 +0000 UTC m=+1021.327965022" lastFinishedPulling="2025-10-02 11:12:52.259026041 +0000 UTC m=+1048.818933632" observedRunningTime="2025-10-02 11:12:53.244464235 +0000 UTC m=+1049.804371816" watchObservedRunningTime="2025-10-02 11:12:53.250569789 +0000 UTC m=+1049.810477380" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.262933 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.263000 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.312782 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.318826 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.374982 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dqggc"] Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.375663 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" podUID="d24483b5-1674-48c0-8330-1a96574d3460" containerName="dnsmasq-dns" containerID="cri-o://f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe" gracePeriod=10 Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.668821 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.669154 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.727199 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.835083 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.913426 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-dns-svc\") pod \"d24483b5-1674-48c0-8330-1a96574d3460\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.913554 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9gpb\" (UniqueName: \"kubernetes.io/projected/d24483b5-1674-48c0-8330-1a96574d3460-kube-api-access-d9gpb\") pod \"d24483b5-1674-48c0-8330-1a96574d3460\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.913626 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-config\") pod \"d24483b5-1674-48c0-8330-1a96574d3460\" (UID: \"d24483b5-1674-48c0-8330-1a96574d3460\") " Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.919421 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24483b5-1674-48c0-8330-1a96574d3460-kube-api-access-d9gpb" (OuterVolumeSpecName: "kube-api-access-d9gpb") pod "d24483b5-1674-48c0-8330-1a96574d3460" (UID: "d24483b5-1674-48c0-8330-1a96574d3460"). InnerVolumeSpecName "kube-api-access-d9gpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.955724 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d24483b5-1674-48c0-8330-1a96574d3460" (UID: "d24483b5-1674-48c0-8330-1a96574d3460"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:53 crc kubenswrapper[4835]: I1002 11:12:53.956081 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-config" (OuterVolumeSpecName: "config") pod "d24483b5-1674-48c0-8330-1a96574d3460" (UID: "d24483b5-1674-48c0-8330-1a96574d3460"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.015828 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.015870 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9gpb\" (UniqueName: \"kubernetes.io/projected/d24483b5-1674-48c0-8330-1a96574d3460-kube-api-access-d9gpb\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.015881 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24483b5-1674-48c0-8330-1a96574d3460-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.235379 4835 generic.go:334] "Generic (PLEG): container finished" podID="d24483b5-1674-48c0-8330-1a96574d3460" containerID="f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe" exitCode=0 Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.236428 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" event={"ID":"d24483b5-1674-48c0-8330-1a96574d3460","Type":"ContainerDied","Data":"f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe"} Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.236477 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" event={"ID":"d24483b5-1674-48c0-8330-1a96574d3460","Type":"ContainerDied","Data":"c142484196cb99c749a7db033f1ece07f0a3c07f3f84a45d584384cc15dd4280"} Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.236506 4835 scope.go:117] "RemoveContainer" containerID="f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.236617 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dqggc" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.256320 4835 scope.go:117] "RemoveContainer" containerID="8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.270830 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dqggc"] Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.278849 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dqggc"] Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.296850 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.296915 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.299285 4835 scope.go:117] "RemoveContainer" containerID="f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe" Oct 02 11:12:54 crc kubenswrapper[4835]: E1002 11:12:54.299811 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe\": container with ID starting with f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe not found: ID does not exist" containerID="f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.299881 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe"} err="failed to get container status \"f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe\": rpc error: code = NotFound desc = could not find container \"f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe\": container with ID starting with f7b61b826053c7797235900fc8bdb627c306b70b67526e3d523a1d46525757fe not found: ID does not exist" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.299950 4835 scope.go:117] "RemoveContainer" containerID="8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb" Oct 02 11:12:54 crc kubenswrapper[4835]: E1002 11:12:54.300597 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb\": container with ID starting with 8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb not found: ID does not exist" containerID="8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb" Oct 02 11:12:54 crc kubenswrapper[4835]: I1002 11:12:54.300641 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb"} err="failed to get container status \"8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb\": rpc error: code = NotFound desc = could not find container \"8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb\": container with ID starting with 8e06fecc92a63b1ed283a1c11cd046a2bb69edade2dda822c3c0d415f1ec61fb not found: ID does not exist" Oct 02 11:12:56 crc kubenswrapper[4835]: I1002 11:12:56.263618 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d24483b5-1674-48c0-8330-1a96574d3460" path="/var/lib/kubelet/pods/d24483b5-1674-48c0-8330-1a96574d3460/volumes" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.313713 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e","Type":"ContainerStarted","Data":"07a2f748d394d7828bdfd3cccf24f38d2840d09371f24f5731a7311844a18386"} Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.362646 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.366180878 podStartE2EDuration="44.362597302s" podCreationTimestamp="2025-10-02 11:12:19 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.676180771 +0000 UTC m=+1021.236088372" lastFinishedPulling="2025-10-02 11:13:02.672597215 +0000 UTC m=+1059.232504796" observedRunningTime="2025-10-02 11:13:03.33910634 +0000 UTC m=+1059.899013921" watchObservedRunningTime="2025-10-02 11:13:03.362597302 +0000 UTC m=+1059.922504883" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.686507 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wbhtf"] Oct 02 11:13:03 crc kubenswrapper[4835]: E1002 11:13:03.687012 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24483b5-1674-48c0-8330-1a96574d3460" containerName="init" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.687033 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24483b5-1674-48c0-8330-1a96574d3460" containerName="init" Oct 02 11:13:03 crc kubenswrapper[4835]: E1002 11:13:03.687057 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0859fb19-de1d-40d4-ab38-7e52e98d2391" containerName="init" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.687065 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0859fb19-de1d-40d4-ab38-7e52e98d2391" containerName="init" Oct 02 11:13:03 crc kubenswrapper[4835]: E1002 11:13:03.687090 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24483b5-1674-48c0-8330-1a96574d3460" containerName="dnsmasq-dns" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.687097 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24483b5-1674-48c0-8330-1a96574d3460" containerName="dnsmasq-dns" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.687314 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0859fb19-de1d-40d4-ab38-7e52e98d2391" containerName="init" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.687364 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24483b5-1674-48c0-8330-1a96574d3460" containerName="dnsmasq-dns" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.688083 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wbhtf" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.700117 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wbhtf"] Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.797134 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsccc\" (UniqueName: \"kubernetes.io/projected/a036d6a1-0e89-4e48-af67-ed924271e652-kube-api-access-zsccc\") pod \"keystone-db-create-wbhtf\" (UID: \"a036d6a1-0e89-4e48-af67-ed924271e652\") " pod="openstack/keystone-db-create-wbhtf" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.871862 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7qlnx"] Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.874077 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7qlnx" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.881115 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7qlnx"] Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.898866 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsccc\" (UniqueName: \"kubernetes.io/projected/a036d6a1-0e89-4e48-af67-ed924271e652-kube-api-access-zsccc\") pod \"keystone-db-create-wbhtf\" (UID: \"a036d6a1-0e89-4e48-af67-ed924271e652\") " pod="openstack/keystone-db-create-wbhtf" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.899038 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smr97\" (UniqueName: \"kubernetes.io/projected/feb816f8-4d3b-441d-acf1-7f0cf9828759-kube-api-access-smr97\") pod \"placement-db-create-7qlnx\" (UID: \"feb816f8-4d3b-441d-acf1-7f0cf9828759\") " pod="openstack/placement-db-create-7qlnx" Oct 02 11:13:03 crc kubenswrapper[4835]: I1002 11:13:03.918425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsccc\" (UniqueName: \"kubernetes.io/projected/a036d6a1-0e89-4e48-af67-ed924271e652-kube-api-access-zsccc\") pod \"keystone-db-create-wbhtf\" (UID: \"a036d6a1-0e89-4e48-af67-ed924271e652\") " pod="openstack/keystone-db-create-wbhtf" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.001138 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smr97\" (UniqueName: \"kubernetes.io/projected/feb816f8-4d3b-441d-acf1-7f0cf9828759-kube-api-access-smr97\") pod \"placement-db-create-7qlnx\" (UID: \"feb816f8-4d3b-441d-acf1-7f0cf9828759\") " pod="openstack/placement-db-create-7qlnx" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.011074 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wbhtf" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.019345 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smr97\" (UniqueName: \"kubernetes.io/projected/feb816f8-4d3b-441d-acf1-7f0cf9828759-kube-api-access-smr97\") pod \"placement-db-create-7qlnx\" (UID: \"feb816f8-4d3b-441d-acf1-7f0cf9828759\") " pod="openstack/placement-db-create-7qlnx" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.161630 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8lvj9"] Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.162855 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8lvj9" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.169665 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8lvj9"] Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.192436 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7qlnx" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.206073 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnj2z\" (UniqueName: \"kubernetes.io/projected/6bc00dc1-89cb-437a-a52a-013dbede6dee-kube-api-access-rnj2z\") pod \"glance-db-create-8lvj9\" (UID: \"6bc00dc1-89cb-437a-a52a-013dbede6dee\") " pod="openstack/glance-db-create-8lvj9" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.308397 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnj2z\" (UniqueName: \"kubernetes.io/projected/6bc00dc1-89cb-437a-a52a-013dbede6dee-kube-api-access-rnj2z\") pod \"glance-db-create-8lvj9\" (UID: \"6bc00dc1-89cb-437a-a52a-013dbede6dee\") " pod="openstack/glance-db-create-8lvj9" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.329403 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnj2z\" (UniqueName: \"kubernetes.io/projected/6bc00dc1-89cb-437a-a52a-013dbede6dee-kube-api-access-rnj2z\") pod \"glance-db-create-8lvj9\" (UID: \"6bc00dc1-89cb-437a-a52a-013dbede6dee\") " pod="openstack/glance-db-create-8lvj9" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.481535 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8lvj9" Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.483612 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wbhtf"] Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.618623 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7qlnx"] Oct 02 11:13:04 crc kubenswrapper[4835]: W1002 11:13:04.628732 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb816f8_4d3b_441d_acf1_7f0cf9828759.slice/crio-4150616bb1a6b11221fef78b96f2e1e18e9453274b08e34307d402462b0bbfec WatchSource:0}: Error finding container 4150616bb1a6b11221fef78b96f2e1e18e9453274b08e34307d402462b0bbfec: Status 404 returned error can't find the container with id 4150616bb1a6b11221fef78b96f2e1e18e9453274b08e34307d402462b0bbfec Oct 02 11:13:04 crc kubenswrapper[4835]: I1002 11:13:04.918933 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8lvj9"] Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.341586 4835 generic.go:334] "Generic (PLEG): container finished" podID="a036d6a1-0e89-4e48-af67-ed924271e652" containerID="53bab9491bc4f1fa7117c649a765e9d3f854cd3ef6aa0b6951abd3a00bfc8533" exitCode=0 Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.341663 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wbhtf" event={"ID":"a036d6a1-0e89-4e48-af67-ed924271e652","Type":"ContainerDied","Data":"53bab9491bc4f1fa7117c649a765e9d3f854cd3ef6aa0b6951abd3a00bfc8533"} Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.341692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wbhtf" event={"ID":"a036d6a1-0e89-4e48-af67-ed924271e652","Type":"ContainerStarted","Data":"3a39a8bf8f784aafe2d08c7ac432a819661f276e0777f712009e9cda233cd8c7"} Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.344603 4835 generic.go:334] "Generic (PLEG): container finished" podID="feb816f8-4d3b-441d-acf1-7f0cf9828759" containerID="7ce377a946ffb86e7ab11f161da89dcf78ee2c757911153e86d558cbd2130c09" exitCode=0 Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.344643 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7qlnx" event={"ID":"feb816f8-4d3b-441d-acf1-7f0cf9828759","Type":"ContainerDied","Data":"7ce377a946ffb86e7ab11f161da89dcf78ee2c757911153e86d558cbd2130c09"} Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.344679 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7qlnx" event={"ID":"feb816f8-4d3b-441d-acf1-7f0cf9828759","Type":"ContainerStarted","Data":"4150616bb1a6b11221fef78b96f2e1e18e9453274b08e34307d402462b0bbfec"} Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.346424 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8lvj9" event={"ID":"6bc00dc1-89cb-437a-a52a-013dbede6dee","Type":"ContainerStarted","Data":"9a76b78bc439967afd3c5b37ccdb5c88e396d3cdb1117df0c10bdd2fe3d81ee1"} Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.346454 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8lvj9" event={"ID":"6bc00dc1-89cb-437a-a52a-013dbede6dee","Type":"ContainerStarted","Data":"d479d4a23d19c0c5788af9baa93e6f239de8934972e2978fe01767b4dd9e58b5"} Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.384184 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-8lvj9" podStartSLOduration=1.384155739 podStartE2EDuration="1.384155739s" podCreationTimestamp="2025-10-02 11:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:05.383270884 +0000 UTC m=+1061.943178505" watchObservedRunningTime="2025-10-02 11:13:05.384155739 +0000 UTC m=+1061.944063330" Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.579268 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.857723 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.858120 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 11:13:05 crc kubenswrapper[4835]: I1002 11:13:05.918982 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.357850 4835 generic.go:334] "Generic (PLEG): container finished" podID="6bc00dc1-89cb-437a-a52a-013dbede6dee" containerID="9a76b78bc439967afd3c5b37ccdb5c88e396d3cdb1117df0c10bdd2fe3d81ee1" exitCode=0 Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.357952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8lvj9" event={"ID":"6bc00dc1-89cb-437a-a52a-013dbede6dee","Type":"ContainerDied","Data":"9a76b78bc439967afd3c5b37ccdb5c88e396d3cdb1117df0c10bdd2fe3d81ee1"} Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.726213 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7qlnx" Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.732002 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wbhtf" Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.858548 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsccc\" (UniqueName: \"kubernetes.io/projected/a036d6a1-0e89-4e48-af67-ed924271e652-kube-api-access-zsccc\") pod \"a036d6a1-0e89-4e48-af67-ed924271e652\" (UID: \"a036d6a1-0e89-4e48-af67-ed924271e652\") " Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.858651 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smr97\" (UniqueName: \"kubernetes.io/projected/feb816f8-4d3b-441d-acf1-7f0cf9828759-kube-api-access-smr97\") pod \"feb816f8-4d3b-441d-acf1-7f0cf9828759\" (UID: \"feb816f8-4d3b-441d-acf1-7f0cf9828759\") " Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.866855 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a036d6a1-0e89-4e48-af67-ed924271e652-kube-api-access-zsccc" (OuterVolumeSpecName: "kube-api-access-zsccc") pod "a036d6a1-0e89-4e48-af67-ed924271e652" (UID: "a036d6a1-0e89-4e48-af67-ed924271e652"). InnerVolumeSpecName "kube-api-access-zsccc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.867305 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb816f8-4d3b-441d-acf1-7f0cf9828759-kube-api-access-smr97" (OuterVolumeSpecName: "kube-api-access-smr97") pod "feb816f8-4d3b-441d-acf1-7f0cf9828759" (UID: "feb816f8-4d3b-441d-acf1-7f0cf9828759"). InnerVolumeSpecName "kube-api-access-smr97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.961705 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsccc\" (UniqueName: \"kubernetes.io/projected/a036d6a1-0e89-4e48-af67-ed924271e652-kube-api-access-zsccc\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:06 crc kubenswrapper[4835]: I1002 11:13:06.961750 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smr97\" (UniqueName: \"kubernetes.io/projected/feb816f8-4d3b-441d-acf1-7f0cf9828759-kube-api-access-smr97\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.369846 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wbhtf" event={"ID":"a036d6a1-0e89-4e48-af67-ed924271e652","Type":"ContainerDied","Data":"3a39a8bf8f784aafe2d08c7ac432a819661f276e0777f712009e9cda233cd8c7"} Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.369895 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a39a8bf8f784aafe2d08c7ac432a819661f276e0777f712009e9cda233cd8c7" Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.369957 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wbhtf" Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.373615 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7qlnx" Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.373667 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7qlnx" event={"ID":"feb816f8-4d3b-441d-acf1-7f0cf9828759","Type":"ContainerDied","Data":"4150616bb1a6b11221fef78b96f2e1e18e9453274b08e34307d402462b0bbfec"} Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.373697 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4150616bb1a6b11221fef78b96f2e1e18e9453274b08e34307d402462b0bbfec" Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.611174 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8lvj9" Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.774834 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnj2z\" (UniqueName: \"kubernetes.io/projected/6bc00dc1-89cb-437a-a52a-013dbede6dee-kube-api-access-rnj2z\") pod \"6bc00dc1-89cb-437a-a52a-013dbede6dee\" (UID: \"6bc00dc1-89cb-437a-a52a-013dbede6dee\") " Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.781438 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc00dc1-89cb-437a-a52a-013dbede6dee-kube-api-access-rnj2z" (OuterVolumeSpecName: "kube-api-access-rnj2z") pod "6bc00dc1-89cb-437a-a52a-013dbede6dee" (UID: "6bc00dc1-89cb-437a-a52a-013dbede6dee"). InnerVolumeSpecName "kube-api-access-rnj2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:07 crc kubenswrapper[4835]: I1002 11:13:07.876841 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnj2z\" (UniqueName: \"kubernetes.io/projected/6bc00dc1-89cb-437a-a52a-013dbede6dee-kube-api-access-rnj2z\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:08 crc kubenswrapper[4835]: I1002 11:13:08.385774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8lvj9" event={"ID":"6bc00dc1-89cb-437a-a52a-013dbede6dee","Type":"ContainerDied","Data":"d479d4a23d19c0c5788af9baa93e6f239de8934972e2978fe01767b4dd9e58b5"} Oct 02 11:13:08 crc kubenswrapper[4835]: I1002 11:13:08.385827 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d479d4a23d19c0c5788af9baa93e6f239de8934972e2978fe01767b4dd9e58b5" Oct 02 11:13:08 crc kubenswrapper[4835]: I1002 11:13:08.385853 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8lvj9" Oct 02 11:13:09 crc kubenswrapper[4835]: I1002 11:13:09.995989 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2tk75" podUID="83798c14-4aa1-4530-82eb-fbe0cd6ceaf9" containerName="ovn-controller" probeResult="failure" output=< Oct 02 11:13:09 crc kubenswrapper[4835]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 11:13:09 crc kubenswrapper[4835]: > Oct 02 11:13:10 crc kubenswrapper[4835]: I1002 11:13:10.917406 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.114896 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:13:11 crc kubenswrapper[4835]: E1002 11:13:11.115467 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb816f8-4d3b-441d-acf1-7f0cf9828759" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.115489 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb816f8-4d3b-441d-acf1-7f0cf9828759" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: E1002 11:13:11.115527 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc00dc1-89cb-437a-a52a-013dbede6dee" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.115535 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc00dc1-89cb-437a-a52a-013dbede6dee" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: E1002 11:13:11.115554 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a036d6a1-0e89-4e48-af67-ed924271e652" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.115564 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a036d6a1-0e89-4e48-af67-ed924271e652" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.115789 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb816f8-4d3b-441d-acf1-7f0cf9828759" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.115811 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a036d6a1-0e89-4e48-af67-ed924271e652" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.115829 4835 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6bc00dc1-89cb-437a-a52a-013dbede6dee" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.117049 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.119129 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-bck6p" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.119637 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.119688 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.126212 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.127182 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.266088 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.266135 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/390fd0ef-bba6-458e-b1e6-121f3f846077-scripts\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.266161 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390fd0ef-bba6-458e-b1e6-121f3f846077-config\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.266294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.266333 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/390fd0ef-bba6-458e-b1e6-121f3f846077-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.266358 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.266387 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmh7\" 
(UniqueName: \"kubernetes.io/projected/390fd0ef-bba6-458e-b1e6-121f3f846077-kube-api-access-xcmh7\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.367497 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/390fd0ef-bba6-458e-b1e6-121f3f846077-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.367552 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.367588 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmh7\" (UniqueName: \"kubernetes.io/projected/390fd0ef-bba6-458e-b1e6-121f3f846077-kube-api-access-xcmh7\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.367617 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.367644 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/390fd0ef-bba6-458e-b1e6-121f3f846077-scripts\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.367668 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390fd0ef-bba6-458e-b1e6-121f3f846077-config\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.367754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.368862 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/390fd0ef-bba6-458e-b1e6-121f3f846077-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.369428 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390fd0ef-bba6-458e-b1e6-121f3f846077-config\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.369562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/390fd0ef-bba6-458e-b1e6-121f3f846077-scripts\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.377007 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.377169 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.378701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/390fd0ef-bba6-458e-b1e6-121f3f846077-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.384733 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmh7\" (UniqueName: \"kubernetes.io/projected/390fd0ef-bba6-458e-b1e6-121f3f846077-kube-api-access-xcmh7\") pod \"ovn-northd-0\" (UID: \"390fd0ef-bba6-458e-b1e6-121f3f846077\") " pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.411639 4835 generic.go:334] "Generic (PLEG): container finished" podID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerID="37458b76b2ffa63de5c4a5fa2995e72d004e9c5d9c332ebf588cdfc9661d408e" exitCode=0 Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.411832 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75581788-2dfd-41d9-8500-0b4e3d050cab","Type":"ContainerDied","Data":"37458b76b2ffa63de5c4a5fa2995e72d004e9c5d9c332ebf588cdfc9661d408e"} Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.414192 4835 generic.go:334] "Generic (PLEG): container finished" podID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerID="0ef1712aa04a0792cf4b9650430c005225c98aa32da76b73bd349ff12c229cbd" exitCode=0 Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.414244 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6bc536a4-ef50-4d6d-aca5-a030ce38ce24","Type":"ContainerDied","Data":"0ef1712aa04a0792cf4b9650430c005225c98aa32da76b73bd349ff12c229cbd"} Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.448151 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.947081 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:13:11 crc kubenswrapper[4835]: W1002 11:13:11.949660 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod390fd0ef_bba6_458e_b1e6_121f3f846077.slice/crio-059baf3f6b467831d39f131cd0218236cf92da4d62ebb32b816b49abc4dbe7f2 WatchSource:0}: Error finding container 059baf3f6b467831d39f131cd0218236cf92da4d62ebb32b816b49abc4dbe7f2: Status 404 returned error can't find the container with id 059baf3f6b467831d39f131cd0218236cf92da4d62ebb32b816b49abc4dbe7f2 Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.986519 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.986590 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.986638 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.987388 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d0ce126b4147f93bbd26e8b66aa4ae542cae2002f8fd305bd26ccf1276aba52"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:13:11 crc kubenswrapper[4835]: I1002 11:13:11.987444 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://2d0ce126b4147f93bbd26e8b66aa4ae542cae2002f8fd305bd26ccf1276aba52" gracePeriod=600 Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.427145 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="2d0ce126b4147f93bbd26e8b66aa4ae542cae2002f8fd305bd26ccf1276aba52" exitCode=0 Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.427193 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"2d0ce126b4147f93bbd26e8b66aa4ae542cae2002f8fd305bd26ccf1276aba52"} Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.427603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"ece753eadbbcedfb37995a0897d789fdf5c6660566a042ca4a738c29f1789c89"} Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.427625 4835 scope.go:117] "RemoveContainer" 
containerID="9482e972b371878a44442b6006c3a59a025e351c1ec2a25e635bbcea7c81f32b" Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.433126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75581788-2dfd-41d9-8500-0b4e3d050cab","Type":"ContainerStarted","Data":"c7d6fcf1360156f5f0d76fb0f5cdea23c360623fbf19f383a3934656ea10dc9b"} Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.433407 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.435655 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6bc536a4-ef50-4d6d-aca5-a030ce38ce24","Type":"ContainerStarted","Data":"a40271397664aed96ba97530aa140906e693827bd323db569c29450f0806063b"} Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.435873 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.437241 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"390fd0ef-bba6-458e-b1e6-121f3f846077","Type":"ContainerStarted","Data":"059baf3f6b467831d39f131cd0218236cf92da4d62ebb32b816b49abc4dbe7f2"} Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.504161 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.33312757 podStartE2EDuration="1m3.50413092s" podCreationTimestamp="2025-10-02 11:12:09 +0000 UTC" firstStartedPulling="2025-10-02 11:12:24.323862858 +0000 UTC m=+1020.883770439" lastFinishedPulling="2025-10-02 11:12:36.494866218 +0000 UTC m=+1033.054773789" observedRunningTime="2025-10-02 11:13:12.496384708 +0000 UTC m=+1069.056292319" watchObservedRunningTime="2025-10-02 11:13:12.50413092 +0000 UTC m=+1069.064038501" Oct 02 11:13:12 crc kubenswrapper[4835]: I1002 11:13:12.550901 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.383511242 podStartE2EDuration="1m3.550879258s" podCreationTimestamp="2025-10-02 11:12:09 +0000 UTC" firstStartedPulling="2025-10-02 11:12:23.943202263 +0000 UTC m=+1020.503109844" lastFinishedPulling="2025-10-02 11:12:37.110570269 +0000 UTC m=+1033.670477860" observedRunningTime="2025-10-02 11:13:12.52578552 +0000 UTC m=+1069.085693111" watchObservedRunningTime="2025-10-02 11:13:12.550879258 +0000 UTC m=+1069.110786859" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.748194 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1832-account-create-8gktd"] Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.750629 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1832-account-create-8gktd" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.753381 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.785481 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1832-account-create-8gktd"] Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.825020 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzgkn\" (UniqueName: \"kubernetes.io/projected/cfe3b0b0-65db-451b-afc6-af7628c749a2-kube-api-access-xzgkn\") pod \"keystone-1832-account-create-8gktd\" (UID: \"cfe3b0b0-65db-451b-afc6-af7628c749a2\") " pod="openstack/keystone-1832-account-create-8gktd" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.928365 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzgkn\" (UniqueName: \"kubernetes.io/projected/cfe3b0b0-65db-451b-afc6-af7628c749a2-kube-api-access-xzgkn\") pod \"keystone-1832-account-create-8gktd\" (UID: \"cfe3b0b0-65db-451b-afc6-af7628c749a2\") " pod="openstack/keystone-1832-account-create-8gktd" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.930383 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-af88-account-create-7tv6x"] Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.931673 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-af88-account-create-7tv6x" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.934795 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.956849 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzgkn\" (UniqueName: \"kubernetes.io/projected/cfe3b0b0-65db-451b-afc6-af7628c749a2-kube-api-access-xzgkn\") pod \"keystone-1832-account-create-8gktd\" (UID: \"cfe3b0b0-65db-451b-afc6-af7628c749a2\") " pod="openstack/keystone-1832-account-create-8gktd" Oct 02 11:13:13 crc kubenswrapper[4835]: I1002 11:13:13.966609 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-af88-account-create-7tv6x"] Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.030591 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggc9n\" (UniqueName: \"kubernetes.io/projected/1ce56b89-22d8-42bc-badc-0da2fd73cb25-kube-api-access-ggc9n\") pod \"placement-af88-account-create-7tv6x\" (UID: \"1ce56b89-22d8-42bc-badc-0da2fd73cb25\") " pod="openstack/placement-af88-account-create-7tv6x" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.076207 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1832-account-create-8gktd" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.132993 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggc9n\" (UniqueName: \"kubernetes.io/projected/1ce56b89-22d8-42bc-badc-0da2fd73cb25-kube-api-access-ggc9n\") pod \"placement-af88-account-create-7tv6x\" (UID: \"1ce56b89-22d8-42bc-badc-0da2fd73cb25\") " pod="openstack/placement-af88-account-create-7tv6x" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.157801 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggc9n\" (UniqueName: \"kubernetes.io/projected/1ce56b89-22d8-42bc-badc-0da2fd73cb25-kube-api-access-ggc9n\") pod \"placement-af88-account-create-7tv6x\" (UID: \"1ce56b89-22d8-42bc-badc-0da2fd73cb25\") " pod="openstack/placement-af88-account-create-7tv6x" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.270066 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-af88-account-create-7tv6x" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.311119 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-341d-account-create-vdzq8"] Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.314998 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-341d-account-create-vdzq8" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.318616 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.338087 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-341d-account-create-vdzq8"] Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.441845 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6mhd\" (UniqueName: \"kubernetes.io/projected/38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c-kube-api-access-m6mhd\") pod \"glance-341d-account-create-vdzq8\" (UID: \"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c\") " pod="openstack/glance-341d-account-create-vdzq8" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.469320 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"390fd0ef-bba6-458e-b1e6-121f3f846077","Type":"ContainerStarted","Data":"0b80b9a47df37d851b5cb2d2a03b39aa5c3541a583199e1f8f8add6c804568fb"} Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.469375 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"390fd0ef-bba6-458e-b1e6-121f3f846077","Type":"ContainerStarted","Data":"4a0ff8ec5dbdf2f931a1b89f9f4cb71559c884f119e3288174c8b970511d1fa1"} Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.469626 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.494847 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.871627637 podStartE2EDuration="3.494828362s" podCreationTimestamp="2025-10-02 11:13:11 +0000 UTC" firstStartedPulling="2025-10-02 11:13:11.951950186 +0000 UTC m=+1068.511857767" lastFinishedPulling="2025-10-02 11:13:13.575150911 +0000 UTC m=+1070.135058492" observedRunningTime="2025-10-02 11:13:14.48706933 +0000 UTC m=+1071.046976931" watchObservedRunningTime="2025-10-02 11:13:14.494828362 +0000 UTC 
m=+1071.054735943" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.543755 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6mhd\" (UniqueName: \"kubernetes.io/projected/38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c-kube-api-access-m6mhd\") pod \"glance-341d-account-create-vdzq8\" (UID: \"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c\") " pod="openstack/glance-341d-account-create-vdzq8" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.570444 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1832-account-create-8gktd"] Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.578871 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6mhd\" (UniqueName: \"kubernetes.io/projected/38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c-kube-api-access-m6mhd\") pod \"glance-341d-account-create-vdzq8\" (UID: \"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c\") " pod="openstack/glance-341d-account-create-vdzq8" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.645816 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-341d-account-create-vdzq8" Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.842690 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-af88-account-create-7tv6x"] Oct 02 11:13:14 crc kubenswrapper[4835]: W1002 11:13:14.871808 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ce56b89_22d8_42bc_badc_0da2fd73cb25.slice/crio-a7b196c165a728227e8f7b8d58509bbc93f875c7db82d0a00fc8b923148e6c7c WatchSource:0}: Error finding container a7b196c165a728227e8f7b8d58509bbc93f875c7db82d0a00fc8b923148e6c7c: Status 404 returned error can't find the container with id a7b196c165a728227e8f7b8d58509bbc93f875c7db82d0a00fc8b923148e6c7c Oct 02 11:13:14 crc kubenswrapper[4835]: I1002 11:13:14.893813 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-341d-account-create-vdzq8"] Oct 02 11:13:14 crc kubenswrapper[4835]: W1002 11:13:14.900601 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38b8ad1d_8591_4ee4_8bfd_1d40e7e6158c.slice/crio-57f0a910460f02a0f60dbf242d0135a8d3813f2df3b6277c0625168009445e8e WatchSource:0}: Error finding container 57f0a910460f02a0f60dbf242d0135a8d3813f2df3b6277c0625168009445e8e: Status 404 returned error can't find the container with id 57f0a910460f02a0f60dbf242d0135a8d3813f2df3b6277c0625168009445e8e Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.024424 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2tk75" podUID="83798c14-4aa1-4530-82eb-fbe0cd6ceaf9" containerName="ovn-controller" probeResult="failure" output=< Oct 02 11:13:15 crc kubenswrapper[4835]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 11:13:15 crc kubenswrapper[4835]: > Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.031013 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.085542 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4bgdg" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.295194 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-2tk75-config-n489w"] Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.296956 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.298970 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.315170 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2tk75-config-n489w"] Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.383080 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run-ovn\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.383167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-scripts\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.383263 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-additional-scripts\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.383297 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.383321 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-log-ovn\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.383338 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhlg6\" (UniqueName: \"kubernetes.io/projected/a4b46902-d519-4483-bd08-b6c1060d2e5b-kube-api-access-qhlg6\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.478989 4835 generic.go:334] "Generic (PLEG): container finished" podID="1ce56b89-22d8-42bc-badc-0da2fd73cb25" containerID="fc21480867b5920158b6ec75ccf76f6a84da3a714e12b8efa940625c8351f855" exitCode=0 Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.479082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-af88-account-create-7tv6x" 
event={"ID":"1ce56b89-22d8-42bc-badc-0da2fd73cb25","Type":"ContainerDied","Data":"fc21480867b5920158b6ec75ccf76f6a84da3a714e12b8efa940625c8351f855"} Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.479115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-af88-account-create-7tv6x" event={"ID":"1ce56b89-22d8-42bc-badc-0da2fd73cb25","Type":"ContainerStarted","Data":"a7b196c165a728227e8f7b8d58509bbc93f875c7db82d0a00fc8b923148e6c7c"} Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.481360 4835 generic.go:334] "Generic (PLEG): container finished" podID="cfe3b0b0-65db-451b-afc6-af7628c749a2" containerID="e1d614c7ea85b6f828fe5a6237381a41ab8606912dd9d7f1b3bd9abadf312385" exitCode=0 Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.481447 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1832-account-create-8gktd" event={"ID":"cfe3b0b0-65db-451b-afc6-af7628c749a2","Type":"ContainerDied","Data":"e1d614c7ea85b6f828fe5a6237381a41ab8606912dd9d7f1b3bd9abadf312385"} Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.481487 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1832-account-create-8gktd" event={"ID":"cfe3b0b0-65db-451b-afc6-af7628c749a2","Type":"ContainerStarted","Data":"2ce45f2c78b3f75a32b50cdb89f8cbf3a4c5e9aaa3d4fcb18b9e832f0de09d7d"} Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.483472 4835 generic.go:334] "Generic (PLEG): container finished" podID="38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c" containerID="fd42a9561cca9b64ac5d58da62b0197412c8da4160052b49fd6777a827f054e6" exitCode=0 Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.483564 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-341d-account-create-vdzq8" event={"ID":"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c","Type":"ContainerDied","Data":"fd42a9561cca9b64ac5d58da62b0197412c8da4160052b49fd6777a827f054e6"} Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.483601 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-341d-account-create-vdzq8" event={"ID":"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c","Type":"ContainerStarted","Data":"57f0a910460f02a0f60dbf242d0135a8d3813f2df3b6277c0625168009445e8e"} Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.484642 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run-ovn\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.484846 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-scripts\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.484987 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run-ovn\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.485000 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-additional-scripts\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.485087 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.485130 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-log-ovn\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.485169 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhlg6\" (UniqueName: \"kubernetes.io/projected/a4b46902-d519-4483-bd08-b6c1060d2e5b-kube-api-access-qhlg6\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.485384 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.485426 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-log-ovn\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.485820 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-additional-scripts\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.487146 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-scripts\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.511630 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhlg6\" (UniqueName: \"kubernetes.io/projected/a4b46902-d519-4483-bd08-b6c1060d2e5b-kube-api-access-qhlg6\") pod \"ovn-controller-2tk75-config-n489w\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:15 crc kubenswrapper[4835]: I1002 11:13:15.610789 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:16 crc kubenswrapper[4835]: I1002 11:13:16.046255 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2tk75-config-n489w"] Oct 02 11:13:16 crc kubenswrapper[4835]: I1002 11:13:16.494023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75-config-n489w" event={"ID":"a4b46902-d519-4483-bd08-b6c1060d2e5b","Type":"ContainerStarted","Data":"dda36427274bde044185c3e10000bd1e81dd71ad32c6fd08b776a774e9aa5cce"} Oct 02 11:13:16 crc kubenswrapper[4835]: I1002 11:13:16.494405 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75-config-n489w" event={"ID":"a4b46902-d519-4483-bd08-b6c1060d2e5b","Type":"ContainerStarted","Data":"a9b34b67484074665faff0ba3050e2c151d859280b96419f342b03ed314ebde0"} Oct 02 11:13:16 crc kubenswrapper[4835]: I1002 11:13:16.529856 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2tk75-config-n489w" podStartSLOduration=1.5298301840000001 podStartE2EDuration="1.529830184s" podCreationTimestamp="2025-10-02 11:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:16.518895051 +0000 UTC m=+1073.078802622" watchObservedRunningTime="2025-10-02 11:13:16.529830184 +0000 UTC m=+1073.089737775" Oct 02 11:13:16 crc kubenswrapper[4835]: I1002 11:13:16.865927 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-341d-account-create-vdzq8" Oct 02 11:13:16 crc kubenswrapper[4835]: I1002 11:13:16.964399 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1832-account-create-8gktd" Oct 02 11:13:16 crc kubenswrapper[4835]: I1002 11:13:16.973969 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-af88-account-create-7tv6x" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.015146 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6mhd\" (UniqueName: \"kubernetes.io/projected/38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c-kube-api-access-m6mhd\") pod \"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c\" (UID: \"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c\") " Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.023882 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c-kube-api-access-m6mhd" (OuterVolumeSpecName: "kube-api-access-m6mhd") pod "38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c" (UID: "38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c"). InnerVolumeSpecName "kube-api-access-m6mhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.117509 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzgkn\" (UniqueName: \"kubernetes.io/projected/cfe3b0b0-65db-451b-afc6-af7628c749a2-kube-api-access-xzgkn\") pod \"cfe3b0b0-65db-451b-afc6-af7628c749a2\" (UID: \"cfe3b0b0-65db-451b-afc6-af7628c749a2\") " Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.118193 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggc9n\" (UniqueName: \"kubernetes.io/projected/1ce56b89-22d8-42bc-badc-0da2fd73cb25-kube-api-access-ggc9n\") pod \"1ce56b89-22d8-42bc-badc-0da2fd73cb25\" (UID: \"1ce56b89-22d8-42bc-badc-0da2fd73cb25\") " Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.118720 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6mhd\" (UniqueName: \"kubernetes.io/projected/38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c-kube-api-access-m6mhd\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.121614 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce56b89-22d8-42bc-badc-0da2fd73cb25-kube-api-access-ggc9n" (OuterVolumeSpecName: "kube-api-access-ggc9n") pod "1ce56b89-22d8-42bc-badc-0da2fd73cb25" (UID: "1ce56b89-22d8-42bc-badc-0da2fd73cb25"). InnerVolumeSpecName "kube-api-access-ggc9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.121680 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe3b0b0-65db-451b-afc6-af7628c749a2-kube-api-access-xzgkn" (OuterVolumeSpecName: "kube-api-access-xzgkn") pod "cfe3b0b0-65db-451b-afc6-af7628c749a2" (UID: "cfe3b0b0-65db-451b-afc6-af7628c749a2"). InnerVolumeSpecName "kube-api-access-xzgkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.220409 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzgkn\" (UniqueName: \"kubernetes.io/projected/cfe3b0b0-65db-451b-afc6-af7628c749a2-kube-api-access-xzgkn\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.220457 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggc9n\" (UniqueName: \"kubernetes.io/projected/1ce56b89-22d8-42bc-badc-0da2fd73cb25-kube-api-access-ggc9n\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.504834 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-341d-account-create-vdzq8" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.504832 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-341d-account-create-vdzq8" event={"ID":"38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c","Type":"ContainerDied","Data":"57f0a910460f02a0f60dbf242d0135a8d3813f2df3b6277c0625168009445e8e"} Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.505074 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f0a910460f02a0f60dbf242d0135a8d3813f2df3b6277c0625168009445e8e" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.506352 4835 generic.go:334] "Generic (PLEG): container finished" podID="a4b46902-d519-4483-bd08-b6c1060d2e5b" containerID="dda36427274bde044185c3e10000bd1e81dd71ad32c6fd08b776a774e9aa5cce" exitCode=0 Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.506446 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75-config-n489w" event={"ID":"a4b46902-d519-4483-bd08-b6c1060d2e5b","Type":"ContainerDied","Data":"dda36427274bde044185c3e10000bd1e81dd71ad32c6fd08b776a774e9aa5cce"} Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.507891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-af88-account-create-7tv6x" event={"ID":"1ce56b89-22d8-42bc-badc-0da2fd73cb25","Type":"ContainerDied","Data":"a7b196c165a728227e8f7b8d58509bbc93f875c7db82d0a00fc8b923148e6c7c"} Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.508033 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7b196c165a728227e8f7b8d58509bbc93f875c7db82d0a00fc8b923148e6c7c" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.507902 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-af88-account-create-7tv6x" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.509477 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1832-account-create-8gktd" event={"ID":"cfe3b0b0-65db-451b-afc6-af7628c749a2","Type":"ContainerDied","Data":"2ce45f2c78b3f75a32b50cdb89f8cbf3a4c5e9aaa3d4fcb18b9e832f0de09d7d"} Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.509513 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce45f2c78b3f75a32b50cdb89f8cbf3a4c5e9aaa3d4fcb18b9e832f0de09d7d" Oct 02 11:13:17 crc kubenswrapper[4835]: I1002 11:13:17.509525 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1832-account-create-8gktd" Oct 02 11:13:18 crc kubenswrapper[4835]: I1002 11:13:18.903856 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.087474 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-scripts\") pod \"a4b46902-d519-4483-bd08-b6c1060d2e5b\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.087533 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-log-ovn\") pod \"a4b46902-d519-4483-bd08-b6c1060d2e5b\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.087638 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run\") pod \"a4b46902-d519-4483-bd08-b6c1060d2e5b\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.087693 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run-ovn\") pod \"a4b46902-d519-4483-bd08-b6c1060d2e5b\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.087743 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-additional-scripts\") pod \"a4b46902-d519-4483-bd08-b6c1060d2e5b\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.087839 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhlg6\" (UniqueName: \"kubernetes.io/projected/a4b46902-d519-4483-bd08-b6c1060d2e5b-kube-api-access-qhlg6\") pod \"a4b46902-d519-4483-bd08-b6c1060d2e5b\" (UID: \"a4b46902-d519-4483-bd08-b6c1060d2e5b\") " Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.088109 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run" (OuterVolumeSpecName: "var-run") pod "a4b46902-d519-4483-bd08-b6c1060d2e5b" (UID: "a4b46902-d519-4483-bd08-b6c1060d2e5b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.088153 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a4b46902-d519-4483-bd08-b6c1060d2e5b" (UID: "a4b46902-d519-4483-bd08-b6c1060d2e5b"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.088280 4835 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.088297 4835 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.088759 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a4b46902-d519-4483-bd08-b6c1060d2e5b" (UID: "a4b46902-d519-4483-bd08-b6c1060d2e5b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.088797 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a4b46902-d519-4483-bd08-b6c1060d2e5b" (UID: "a4b46902-d519-4483-bd08-b6c1060d2e5b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.088954 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-scripts" (OuterVolumeSpecName: "scripts") pod "a4b46902-d519-4483-bd08-b6c1060d2e5b" (UID: "a4b46902-d519-4483-bd08-b6c1060d2e5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.116062 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b46902-d519-4483-bd08-b6c1060d2e5b-kube-api-access-qhlg6" (OuterVolumeSpecName: "kube-api-access-qhlg6") pod "a4b46902-d519-4483-bd08-b6c1060d2e5b" (UID: "a4b46902-d519-4483-bd08-b6c1060d2e5b"). InnerVolumeSpecName "kube-api-access-qhlg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.189498 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhlg6\" (UniqueName: \"kubernetes.io/projected/a4b46902-d519-4483-bd08-b6c1060d2e5b-kube-api-access-qhlg6\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.189532 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.189542 4835 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4b46902-d519-4483-bd08-b6c1060d2e5b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.189551 4835 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4b46902-d519-4483-bd08-b6c1060d2e5b-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.499621 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7zkdc"] Oct 02 11:13:19 crc kubenswrapper[4835]: E1002 11:13:19.499950 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b46902-d519-4483-bd08-b6c1060d2e5b" containerName="ovn-config" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.499966 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b46902-d519-4483-bd08-b6c1060d2e5b" containerName="ovn-config" Oct 02 11:13:19 crc kubenswrapper[4835]: E1002 11:13:19.499988 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe3b0b0-65db-451b-afc6-af7628c749a2" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.499995 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe3b0b0-65db-451b-afc6-af7628c749a2" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: E1002 11:13:19.500007 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce56b89-22d8-42bc-badc-0da2fd73cb25" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.500012 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce56b89-22d8-42bc-badc-0da2fd73cb25" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: E1002 11:13:19.500036 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.500042 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.500184 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe3b0b0-65db-451b-afc6-af7628c749a2" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.500201 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce56b89-22d8-42bc-badc-0da2fd73cb25" containerName="mariadb-account-create" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.500212 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c" containerName="mariadb-account-create" Oct 02 11:13:19 crc 
kubenswrapper[4835]: I1002 11:13:19.500292 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b46902-d519-4483-bd08-b6c1060d2e5b" containerName="ovn-config" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.500836 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.503131 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rzbh7" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.507833 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.517342 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7zkdc"] Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.524931 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75-config-n489w" event={"ID":"a4b46902-d519-4483-bd08-b6c1060d2e5b","Type":"ContainerDied","Data":"a9b34b67484074665faff0ba3050e2c151d859280b96419f342b03ed314ebde0"} Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.524981 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b34b67484074665faff0ba3050e2c151d859280b96419f342b03ed314ebde0" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.525051 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2tk75-config-n489w" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.675897 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2tk75-config-n489w"] Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.684550 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2tk75-config-n489w"] Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.697474 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2wvl\" (UniqueName: \"kubernetes.io/projected/d9246fcb-4999-4c04-8d46-729b57a896ef-kube-api-access-k2wvl\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.697549 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-combined-ca-bundle\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.697654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-config-data\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.697727 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-db-sync-config-data\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 
11:13:19.799422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-config-data\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.800034 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-db-sync-config-data\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.800310 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2wvl\" (UniqueName: \"kubernetes.io/projected/d9246fcb-4999-4c04-8d46-729b57a896ef-kube-api-access-k2wvl\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.800484 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-combined-ca-bundle\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.804951 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-db-sync-config-data\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.804990 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-combined-ca-bundle\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.805025 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-config-data\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.851274 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2tk75-config-hd7rd"] Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.852592 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.855213 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.860434 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2wvl\" (UniqueName: \"kubernetes.io/projected/d9246fcb-4999-4c04-8d46-729b57a896ef-kube-api-access-k2wvl\") pod \"glance-db-sync-7zkdc\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:19 crc kubenswrapper[4835]: I1002 11:13:19.871601 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2tk75-config-hd7rd"] Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.003477 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2tk75" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.005762 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run-ovn\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.006125 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-log-ovn\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.006173 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.006485 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-scripts\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.006533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4c8g\" (UniqueName: \"kubernetes.io/projected/493f4d8e-1585-4f10-bc9a-f36133834bf4-kube-api-access-c4c8g\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.006569 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-additional-scripts\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.108381 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-scripts\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.108445 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4c8g\" (UniqueName: \"kubernetes.io/projected/493f4d8e-1585-4f10-bc9a-f36133834bf4-kube-api-access-c4c8g\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.108515 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-additional-scripts\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.108645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run-ovn\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.108715 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-log-ovn\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.108771 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.109264 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.109421 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-log-ovn\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.109491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run-ovn\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.109608 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-additional-scripts\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.120047 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-scripts\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.121400 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.142664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4c8g\" (UniqueName: \"kubernetes.io/projected/493f4d8e-1585-4f10-bc9a-f36133834bf4-kube-api-access-c4c8g\") pod \"ovn-controller-2tk75-config-hd7rd\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.227644 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.269528 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b46902-d519-4483-bd08-b6c1060d2e5b" path="/var/lib/kubelet/pods/a4b46902-d519-4483-bd08-b6c1060d2e5b/volumes" Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.672795 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7zkdc"] Oct 02 11:13:20 crc kubenswrapper[4835]: W1002 11:13:20.676331 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9246fcb_4999_4c04_8d46_729b57a896ef.slice/crio-a244507335f3b7d12ee58576c98f84f3061f6ee5222dc84e8aa5cb5e3dbf9598 WatchSource:0}: Error finding container a244507335f3b7d12ee58576c98f84f3061f6ee5222dc84e8aa5cb5e3dbf9598: Status 404 returned error can't find the container with id a244507335f3b7d12ee58576c98f84f3061f6ee5222dc84e8aa5cb5e3dbf9598 Oct 02 11:13:20 crc kubenswrapper[4835]: W1002 11:13:20.804383 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493f4d8e_1585_4f10_bc9a_f36133834bf4.slice/crio-23b11da00aee483b09908bcdebd40b1aa6116e953fc91f20db3a03f3377217da WatchSource:0}: Error finding container 23b11da00aee483b09908bcdebd40b1aa6116e953fc91f20db3a03f3377217da: Status 404 returned error can't find the container with id 23b11da00aee483b09908bcdebd40b1aa6116e953fc91f20db3a03f3377217da Oct 02 11:13:20 crc kubenswrapper[4835]: I1002 11:13:20.806532 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2tk75-config-hd7rd"] Oct 02 11:13:21 crc kubenswrapper[4835]: I1002 11:13:21.053179 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Oct 02 11:13:21 crc kubenswrapper[4835]: I1002 11:13:21.553606 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-7zkdc" event={"ID":"d9246fcb-4999-4c04-8d46-729b57a896ef","Type":"ContainerStarted","Data":"a244507335f3b7d12ee58576c98f84f3061f6ee5222dc84e8aa5cb5e3dbf9598"} Oct 02 11:13:21 crc kubenswrapper[4835]: I1002 11:13:21.557233 4835 generic.go:334] "Generic (PLEG): container finished" podID="493f4d8e-1585-4f10-bc9a-f36133834bf4" containerID="bbc857a30cd84653eda4005289281b2b9f1c274616a77d8554e672d9b98bb02d" exitCode=0 Oct 02 11:13:21 crc kubenswrapper[4835]: I1002 11:13:21.557298 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75-config-hd7rd" event={"ID":"493f4d8e-1585-4f10-bc9a-f36133834bf4","Type":"ContainerDied","Data":"bbc857a30cd84653eda4005289281b2b9f1c274616a77d8554e672d9b98bb02d"} Oct 02 11:13:21 crc kubenswrapper[4835]: I1002 11:13:21.557348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75-config-hd7rd" event={"ID":"493f4d8e-1585-4f10-bc9a-f36133834bf4","Type":"ContainerStarted","Data":"23b11da00aee483b09908bcdebd40b1aa6116e953fc91f20db3a03f3377217da"} Oct 02 11:13:22 crc kubenswrapper[4835]: I1002 11:13:22.875635 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066013 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run-ovn\") pod \"493f4d8e-1585-4f10-bc9a-f36133834bf4\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066107 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-scripts\") pod \"493f4d8e-1585-4f10-bc9a-f36133834bf4\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066159 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-log-ovn\") pod \"493f4d8e-1585-4f10-bc9a-f36133834bf4\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066097 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "493f4d8e-1585-4f10-bc9a-f36133834bf4" (UID: "493f4d8e-1585-4f10-bc9a-f36133834bf4"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066214 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4c8g\" (UniqueName: \"kubernetes.io/projected/493f4d8e-1585-4f10-bc9a-f36133834bf4-kube-api-access-c4c8g\") pod \"493f4d8e-1585-4f10-bc9a-f36133834bf4\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066278 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run\") pod \"493f4d8e-1585-4f10-bc9a-f36133834bf4\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066304 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "493f4d8e-1585-4f10-bc9a-f36133834bf4" (UID: "493f4d8e-1585-4f10-bc9a-f36133834bf4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066342 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-additional-scripts\") pod \"493f4d8e-1585-4f10-bc9a-f36133834bf4\" (UID: \"493f4d8e-1585-4f10-bc9a-f36133834bf4\") " Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066370 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run" (OuterVolumeSpecName: "var-run") pod "493f4d8e-1585-4f10-bc9a-f36133834bf4" (UID: "493f4d8e-1585-4f10-bc9a-f36133834bf4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066673 4835 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066692 4835 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.066700 4835 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493f4d8e-1585-4f10-bc9a-f36133834bf4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.067121 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "493f4d8e-1585-4f10-bc9a-f36133834bf4" (UID: "493f4d8e-1585-4f10-bc9a-f36133834bf4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.067398 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-scripts" (OuterVolumeSpecName: "scripts") pod "493f4d8e-1585-4f10-bc9a-f36133834bf4" (UID: "493f4d8e-1585-4f10-bc9a-f36133834bf4"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.091128 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493f4d8e-1585-4f10-bc9a-f36133834bf4-kube-api-access-c4c8g" (OuterVolumeSpecName: "kube-api-access-c4c8g") pod "493f4d8e-1585-4f10-bc9a-f36133834bf4" (UID: "493f4d8e-1585-4f10-bc9a-f36133834bf4"). InnerVolumeSpecName "kube-api-access-c4c8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.168977 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.169017 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4c8g\" (UniqueName: \"kubernetes.io/projected/493f4d8e-1585-4f10-bc9a-f36133834bf4-kube-api-access-c4c8g\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.169035 4835 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/493f4d8e-1585-4f10-bc9a-f36133834bf4-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.576607 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2tk75-config-hd7rd" event={"ID":"493f4d8e-1585-4f10-bc9a-f36133834bf4","Type":"ContainerDied","Data":"23b11da00aee483b09908bcdebd40b1aa6116e953fc91f20db3a03f3377217da"} Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.576670 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b11da00aee483b09908bcdebd40b1aa6116e953fc91f20db3a03f3377217da" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.576640 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2tk75-config-hd7rd" Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.947194 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2tk75-config-hd7rd"] Oct 02 11:13:23 crc kubenswrapper[4835]: I1002 11:13:23.955941 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2tk75-config-hd7rd"] Oct 02 11:13:24 crc kubenswrapper[4835]: I1002 11:13:24.267437 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493f4d8e-1585-4f10-bc9a-f36133834bf4" path="/var/lib/kubelet/pods/493f4d8e-1585-4f10-bc9a-f36133834bf4/volumes" Oct 02 11:13:26 crc kubenswrapper[4835]: I1002 11:13:26.508885 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 11:13:30 crc kubenswrapper[4835]: I1002 11:13:30.761158 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.040749 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bsb2r"] Oct 02 11:13:31 crc kubenswrapper[4835]: E1002 11:13:31.045439 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493f4d8e-1585-4f10-bc9a-f36133834bf4" containerName="ovn-config" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.045482 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="493f4d8e-1585-4f10-bc9a-f36133834bf4" containerName="ovn-config" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.045705 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="493f4d8e-1585-4f10-bc9a-f36133834bf4" containerName="ovn-config" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.046490 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bsb2r" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.052270 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bsb2r"] Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.052448 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.148572 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ntmw9"] Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.149700 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ntmw9" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.175566 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ntmw9"] Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.223056 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpqxd\" (UniqueName: \"kubernetes.io/projected/caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638-kube-api-access-wpqxd\") pod \"cinder-db-create-bsb2r\" (UID: \"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638\") " pod="openstack/cinder-db-create-bsb2r" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.325013 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hltz\" (UniqueName: \"kubernetes.io/projected/90716226-5bad-4b04-92d3-b3efdd7efd6d-kube-api-access-4hltz\") pod \"barbican-db-create-ntmw9\" (UID: \"90716226-5bad-4b04-92d3-b3efdd7efd6d\") " pod="openstack/barbican-db-create-ntmw9" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.325129 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpqxd\" (UniqueName: \"kubernetes.io/projected/caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638-kube-api-access-wpqxd\") pod \"cinder-db-create-bsb2r\" (UID: \"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638\") " pod="openstack/cinder-db-create-bsb2r" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.346859 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9gkdm"] Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.348142 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9gkdm" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.357049 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpqxd\" (UniqueName: \"kubernetes.io/projected/caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638-kube-api-access-wpqxd\") pod \"cinder-db-create-bsb2r\" (UID: \"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638\") " pod="openstack/cinder-db-create-bsb2r" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.370130 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bsb2r" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.379067 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9gkdm"] Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.426122 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hltz\" (UniqueName: \"kubernetes.io/projected/90716226-5bad-4b04-92d3-b3efdd7efd6d-kube-api-access-4hltz\") pod \"barbican-db-create-ntmw9\" (UID: \"90716226-5bad-4b04-92d3-b3efdd7efd6d\") " pod="openstack/barbican-db-create-ntmw9" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.435106 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hbr8l"] Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.436337 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.438280 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.442367 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.442593 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.442706 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5nt8" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.449059 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hbr8l"] Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.449461 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hltz\" (UniqueName: \"kubernetes.io/projected/90716226-5bad-4b04-92d3-b3efdd7efd6d-kube-api-access-4hltz\") pod \"barbican-db-create-ntmw9\" (UID: \"90716226-5bad-4b04-92d3-b3efdd7efd6d\") " pod="openstack/barbican-db-create-ntmw9" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.470787 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ntmw9" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.528326 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dqh\" (UniqueName: \"kubernetes.io/projected/aca1fdbc-9f7c-43fd-8bd3-812b87bbd432-kube-api-access-t8dqh\") pod \"neutron-db-create-9gkdm\" (UID: \"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432\") " pod="openstack/neutron-db-create-9gkdm" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.528564 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-combined-ca-bundle\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.528648 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-config-data\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.528731 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lfld\" (UniqueName: \"kubernetes.io/projected/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-kube-api-access-4lfld\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.630752 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-combined-ca-bundle\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.630801 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-config-data\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.630832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lfld\" (UniqueName: \"kubernetes.io/projected/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-kube-api-access-4lfld\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.630882 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dqh\" (UniqueName: \"kubernetes.io/projected/aca1fdbc-9f7c-43fd-8bd3-812b87bbd432-kube-api-access-t8dqh\") pod \"neutron-db-create-9gkdm\" (UID: \"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432\") " pod="openstack/neutron-db-create-9gkdm" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.635374 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-combined-ca-bundle\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.649076 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-config-data\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.660381 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lfld\" (UniqueName: \"kubernetes.io/projected/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-kube-api-access-4lfld\") pod \"keystone-db-sync-hbr8l\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.660896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dqh\" (UniqueName: \"kubernetes.io/projected/aca1fdbc-9f7c-43fd-8bd3-812b87bbd432-kube-api-access-t8dqh\") pod \"neutron-db-create-9gkdm\" (UID: \"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432\") " pod="openstack/neutron-db-create-9gkdm" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.726046 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9gkdm" Oct 02 11:13:31 crc kubenswrapper[4835]: I1002 11:13:31.784719 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:34 crc kubenswrapper[4835]: W1002 11:13:34.790447 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c1e8272_6be9_43d5_99e6_571b3d2a5ba1.slice/crio-f60890608f26ade5a882fe896847715c5b62fb431622dde50b415b56cae1e032 WatchSource:0}: Error finding container f60890608f26ade5a882fe896847715c5b62fb431622dde50b415b56cae1e032: Status 404 returned error can't find the container with id f60890608f26ade5a882fe896847715c5b62fb431622dde50b415b56cae1e032 Oct 02 11:13:34 crc kubenswrapper[4835]: I1002 11:13:34.796444 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hbr8l"] Oct 02 11:13:34 crc kubenswrapper[4835]: I1002 11:13:34.833255 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bsb2r"] Oct 02 11:13:34 crc kubenswrapper[4835]: W1002 11:13:34.836080 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaaeb6a6_bdf2_48b9_9df8_0a2f5cb6e638.slice/crio-4e876416522e49200e64cebab8b05f7d0945cc77ead8b372aa32c0d048feaf28 WatchSource:0}: Error finding container 4e876416522e49200e64cebab8b05f7d0945cc77ead8b372aa32c0d048feaf28: Status 404 returned error can't find the container with id 4e876416522e49200e64cebab8b05f7d0945cc77ead8b372aa32c0d048feaf28 Oct 02 11:13:34 crc kubenswrapper[4835]: I1002 11:13:34.974682 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ntmw9"] Oct 02 11:13:34 crc kubenswrapper[4835]: I1002 11:13:34.985132 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9gkdm"] Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.711499 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntmw9" event={"ID":"90716226-5bad-4b04-92d3-b3efdd7efd6d","Type":"ContainerStarted","Data":"4b9690ce3721de722889d416e854e99badf7536706eafcb2b68ce3d7017f3596"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.711883 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntmw9" event={"ID":"90716226-5bad-4b04-92d3-b3efdd7efd6d","Type":"ContainerStarted","Data":"ce125bc96ababa97de1df4aa79d4ddc5b4ab9a819a429af774719ecf37bef475"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.714304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hbr8l" event={"ID":"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1","Type":"ContainerStarted","Data":"f60890608f26ade5a882fe896847715c5b62fb431622dde50b415b56cae1e032"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.716589 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9gkdm" event={"ID":"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432","Type":"ContainerStarted","Data":"3f05e99f67eb69762c5e9de1350c42d891c293c38ddcdb2539719037d67fcc33"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.716619 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9gkdm" event={"ID":"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432","Type":"ContainerStarted","Data":"a7fe29914fec5f16c7d499824995490b1c2ad5953de491ff45b64a85069dcb53"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.717936 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7zkdc" 
event={"ID":"d9246fcb-4999-4c04-8d46-729b57a896ef","Type":"ContainerStarted","Data":"9827952f27a61a56cefbf3fd9d3faf8b888864042d0fafa80597e63a5c2288d5"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.723152 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bsb2r" event={"ID":"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638","Type":"ContainerStarted","Data":"3f3f43474c9b2c2a8e0119c1a1d4fd22a9f2210a32ddc9751185fc54391f00ee"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.723198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bsb2r" event={"ID":"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638","Type":"ContainerStarted","Data":"4e876416522e49200e64cebab8b05f7d0945cc77ead8b372aa32c0d048feaf28"} Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.727409 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-ntmw9" podStartSLOduration=4.727386961 podStartE2EDuration="4.727386961s" podCreationTimestamp="2025-10-02 11:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:35.726100804 +0000 UTC m=+1092.286008395" watchObservedRunningTime="2025-10-02 11:13:35.727386961 +0000 UTC m=+1092.287294552" Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.752710 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-9gkdm" podStartSLOduration=4.752682785 podStartE2EDuration="4.752682785s" podCreationTimestamp="2025-10-02 11:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:35.74203117 +0000 UTC m=+1092.301938751" watchObservedRunningTime="2025-10-02 11:13:35.752682785 +0000 UTC m=+1092.312590376" Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.762196 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-bsb2r" podStartSLOduration=4.762140695 podStartE2EDuration="4.762140695s" podCreationTimestamp="2025-10-02 11:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:35.760152558 +0000 UTC m=+1092.320060149" watchObservedRunningTime="2025-10-02 11:13:35.762140695 +0000 UTC m=+1092.322048306" Oct 02 11:13:35 crc kubenswrapper[4835]: I1002 11:13:35.785683 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7zkdc" podStartSLOduration=2.607452352 podStartE2EDuration="16.785664749s" podCreationTimestamp="2025-10-02 11:13:19 +0000 UTC" firstStartedPulling="2025-10-02 11:13:20.679716201 +0000 UTC m=+1077.239623782" lastFinishedPulling="2025-10-02 11:13:34.857928598 +0000 UTC m=+1091.417836179" observedRunningTime="2025-10-02 11:13:35.782504888 +0000 UTC m=+1092.342412489" watchObservedRunningTime="2025-10-02 11:13:35.785664749 +0000 UTC m=+1092.345572320" Oct 02 11:13:36 crc kubenswrapper[4835]: I1002 11:13:36.733105 4835 generic.go:334] "Generic (PLEG): container finished" podID="90716226-5bad-4b04-92d3-b3efdd7efd6d" containerID="4b9690ce3721de722889d416e854e99badf7536706eafcb2b68ce3d7017f3596" exitCode=0 Oct 02 11:13:36 crc kubenswrapper[4835]: I1002 11:13:36.733170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntmw9" 
event={"ID":"90716226-5bad-4b04-92d3-b3efdd7efd6d","Type":"ContainerDied","Data":"4b9690ce3721de722889d416e854e99badf7536706eafcb2b68ce3d7017f3596"} Oct 02 11:13:36 crc kubenswrapper[4835]: I1002 11:13:36.739478 4835 generic.go:334] "Generic (PLEG): container finished" podID="aca1fdbc-9f7c-43fd-8bd3-812b87bbd432" containerID="3f05e99f67eb69762c5e9de1350c42d891c293c38ddcdb2539719037d67fcc33" exitCode=0 Oct 02 11:13:36 crc kubenswrapper[4835]: I1002 11:13:36.739514 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9gkdm" event={"ID":"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432","Type":"ContainerDied","Data":"3f05e99f67eb69762c5e9de1350c42d891c293c38ddcdb2539719037d67fcc33"} Oct 02 11:13:36 crc kubenswrapper[4835]: I1002 11:13:36.741734 4835 generic.go:334] "Generic (PLEG): container finished" podID="caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638" containerID="3f3f43474c9b2c2a8e0119c1a1d4fd22a9f2210a32ddc9751185fc54391f00ee" exitCode=0 Oct 02 11:13:36 crc kubenswrapper[4835]: I1002 11:13:36.742758 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bsb2r" event={"ID":"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638","Type":"ContainerDied","Data":"3f3f43474c9b2c2a8e0119c1a1d4fd22a9f2210a32ddc9751185fc54391f00ee"} Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.126614 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9gkdm" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.132036 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bsb2r" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.147381 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ntmw9" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.291981 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpqxd\" (UniqueName: \"kubernetes.io/projected/caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638-kube-api-access-wpqxd\") pod \"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638\" (UID: \"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638\") " Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.292112 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8dqh\" (UniqueName: \"kubernetes.io/projected/aca1fdbc-9f7c-43fd-8bd3-812b87bbd432-kube-api-access-t8dqh\") pod \"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432\" (UID: \"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432\") " Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.292336 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hltz\" (UniqueName: \"kubernetes.io/projected/90716226-5bad-4b04-92d3-b3efdd7efd6d-kube-api-access-4hltz\") pod \"90716226-5bad-4b04-92d3-b3efdd7efd6d\" (UID: \"90716226-5bad-4b04-92d3-b3efdd7efd6d\") " Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.297055 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90716226-5bad-4b04-92d3-b3efdd7efd6d-kube-api-access-4hltz" (OuterVolumeSpecName: "kube-api-access-4hltz") pod "90716226-5bad-4b04-92d3-b3efdd7efd6d" (UID: "90716226-5bad-4b04-92d3-b3efdd7efd6d"). InnerVolumeSpecName "kube-api-access-4hltz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.297128 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638-kube-api-access-wpqxd" (OuterVolumeSpecName: "kube-api-access-wpqxd") pod "caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638" (UID: "caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638"). InnerVolumeSpecName "kube-api-access-wpqxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.298297 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca1fdbc-9f7c-43fd-8bd3-812b87bbd432-kube-api-access-t8dqh" (OuterVolumeSpecName: "kube-api-access-t8dqh") pod "aca1fdbc-9f7c-43fd-8bd3-812b87bbd432" (UID: "aca1fdbc-9f7c-43fd-8bd3-812b87bbd432"). InnerVolumeSpecName "kube-api-access-t8dqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.394548 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hltz\" (UniqueName: \"kubernetes.io/projected/90716226-5bad-4b04-92d3-b3efdd7efd6d-kube-api-access-4hltz\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.394600 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpqxd\" (UniqueName: \"kubernetes.io/projected/caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638-kube-api-access-wpqxd\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.394613 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8dqh\" (UniqueName: \"kubernetes.io/projected/aca1fdbc-9f7c-43fd-8bd3-812b87bbd432-kube-api-access-t8dqh\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.769492 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hbr8l" event={"ID":"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1","Type":"ContainerStarted","Data":"d36b35f4dc40f412e5bf505642090240e0a3bcb55aa5f22d2f9094e7707f035c"} Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.771803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9gkdm" event={"ID":"aca1fdbc-9f7c-43fd-8bd3-812b87bbd432","Type":"ContainerDied","Data":"a7fe29914fec5f16c7d499824995490b1c2ad5953de491ff45b64a85069dcb53"} Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.771856 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fe29914fec5f16c7d499824995490b1c2ad5953de491ff45b64a85069dcb53" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.772098 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9gkdm" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.773889 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bsb2r" event={"ID":"caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638","Type":"ContainerDied","Data":"4e876416522e49200e64cebab8b05f7d0945cc77ead8b372aa32c0d048feaf28"} Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.773926 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e876416522e49200e64cebab8b05f7d0945cc77ead8b372aa32c0d048feaf28" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.774002 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bsb2r" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.777995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntmw9" event={"ID":"90716226-5bad-4b04-92d3-b3efdd7efd6d","Type":"ContainerDied","Data":"ce125bc96ababa97de1df4aa79d4ddc5b4ab9a819a429af774719ecf37bef475"} Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.778032 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce125bc96ababa97de1df4aa79d4ddc5b4ab9a819a429af774719ecf37bef475" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.778047 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ntmw9" Oct 02 11:13:39 crc kubenswrapper[4835]: I1002 11:13:39.804054 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hbr8l" podStartSLOduration=4.422626519 podStartE2EDuration="8.804030103s" podCreationTimestamp="2025-10-02 11:13:31 +0000 UTC" firstStartedPulling="2025-10-02 11:13:34.79302273 +0000 UTC m=+1091.352930311" lastFinishedPulling="2025-10-02 11:13:39.174426314 +0000 UTC m=+1095.734333895" observedRunningTime="2025-10-02 11:13:39.787773508 +0000 UTC m=+1096.347681089" watchObservedRunningTime="2025-10-02 11:13:39.804030103 +0000 UTC m=+1096.363937684" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.191202 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b95b-account-create-jlqb5"] Oct 02 11:13:41 crc kubenswrapper[4835]: E1002 11:13:41.191898 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.191913 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: E1002 11:13:41.191931 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca1fdbc-9f7c-43fd-8bd3-812b87bbd432" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.191937 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca1fdbc-9f7c-43fd-8bd3-812b87bbd432" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: E1002 11:13:41.191955 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90716226-5bad-4b04-92d3-b3efdd7efd6d" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.191961 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="90716226-5bad-4b04-92d3-b3efdd7efd6d" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.192113 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="90716226-5bad-4b04-92d3-b3efdd7efd6d" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.192121 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.192137 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca1fdbc-9f7c-43fd-8bd3-812b87bbd432" containerName="mariadb-database-create" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.192718 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b95b-account-create-jlqb5" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.202691 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b95b-account-create-jlqb5"] Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.202795 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.329561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9mmq\" (UniqueName: \"kubernetes.io/projected/92bcb867-7f39-415a-8565-027fa8d2963e-kube-api-access-r9mmq\") pod \"cinder-b95b-account-create-jlqb5\" (UID: \"92bcb867-7f39-415a-8565-027fa8d2963e\") " pod="openstack/cinder-b95b-account-create-jlqb5" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.430975 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9mmq\" (UniqueName: \"kubernetes.io/projected/92bcb867-7f39-415a-8565-027fa8d2963e-kube-api-access-r9mmq\") pod \"cinder-b95b-account-create-jlqb5\" (UID: \"92bcb867-7f39-415a-8565-027fa8d2963e\") " pod="openstack/cinder-b95b-account-create-jlqb5" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.451496 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9mmq\" (UniqueName: \"kubernetes.io/projected/92bcb867-7f39-415a-8565-027fa8d2963e-kube-api-access-r9mmq\") pod \"cinder-b95b-account-create-jlqb5\" (UID: \"92bcb867-7f39-415a-8565-027fa8d2963e\") " pod="openstack/cinder-b95b-account-create-jlqb5" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.561858 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b95b-account-create-jlqb5" Oct 02 11:13:41 crc kubenswrapper[4835]: I1002 11:13:41.983477 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b95b-account-create-jlqb5"] Oct 02 11:13:41 crc kubenswrapper[4835]: W1002 11:13:41.988043 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92bcb867_7f39_415a_8565_027fa8d2963e.slice/crio-f75e687bb405f5147cd5023dccca9c5c52269045a8770336b3e4b4763a968976 WatchSource:0}: Error finding container f75e687bb405f5147cd5023dccca9c5c52269045a8770336b3e4b4763a968976: Status 404 returned error can't find the container with id f75e687bb405f5147cd5023dccca9c5c52269045a8770336b3e4b4763a968976 Oct 02 11:13:42 crc kubenswrapper[4835]: I1002 11:13:42.804775 4835 generic.go:334] "Generic (PLEG): container finished" podID="92bcb867-7f39-415a-8565-027fa8d2963e" containerID="edf69bdd7b67918d277fa4aeed8465b90d59b51310177c003f173865baf7b16a" exitCode=0 Oct 02 11:13:42 crc kubenswrapper[4835]: I1002 11:13:42.804833 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b95b-account-create-jlqb5" event={"ID":"92bcb867-7f39-415a-8565-027fa8d2963e","Type":"ContainerDied","Data":"edf69bdd7b67918d277fa4aeed8465b90d59b51310177c003f173865baf7b16a"} Oct 02 11:13:42 crc kubenswrapper[4835]: I1002 11:13:42.804868 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b95b-account-create-jlqb5" event={"ID":"92bcb867-7f39-415a-8565-027fa8d2963e","Type":"ContainerStarted","Data":"f75e687bb405f5147cd5023dccca9c5c52269045a8770336b3e4b4763a968976"} Oct 02 11:13:43 crc kubenswrapper[4835]: I1002 11:13:43.817467 4835 generic.go:334] "Generic (PLEG): container 
finished" podID="2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" containerID="d36b35f4dc40f412e5bf505642090240e0a3bcb55aa5f22d2f9094e7707f035c" exitCode=0 Oct 02 11:13:43 crc kubenswrapper[4835]: I1002 11:13:43.817556 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hbr8l" event={"ID":"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1","Type":"ContainerDied","Data":"d36b35f4dc40f412e5bf505642090240e0a3bcb55aa5f22d2f9094e7707f035c"} Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.150203 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b95b-account-create-jlqb5" Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.283652 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9mmq\" (UniqueName: \"kubernetes.io/projected/92bcb867-7f39-415a-8565-027fa8d2963e-kube-api-access-r9mmq\") pod \"92bcb867-7f39-415a-8565-027fa8d2963e\" (UID: \"92bcb867-7f39-415a-8565-027fa8d2963e\") " Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.290462 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bcb867-7f39-415a-8565-027fa8d2963e-kube-api-access-r9mmq" (OuterVolumeSpecName: "kube-api-access-r9mmq") pod "92bcb867-7f39-415a-8565-027fa8d2963e" (UID: "92bcb867-7f39-415a-8565-027fa8d2963e"). InnerVolumeSpecName "kube-api-access-r9mmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.386259 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9mmq\" (UniqueName: \"kubernetes.io/projected/92bcb867-7f39-415a-8565-027fa8d2963e-kube-api-access-r9mmq\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.828467 4835 generic.go:334] "Generic (PLEG): container finished" podID="d9246fcb-4999-4c04-8d46-729b57a896ef" containerID="9827952f27a61a56cefbf3fd9d3faf8b888864042d0fafa80597e63a5c2288d5" exitCode=0 Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.828588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7zkdc" event={"ID":"d9246fcb-4999-4c04-8d46-729b57a896ef","Type":"ContainerDied","Data":"9827952f27a61a56cefbf3fd9d3faf8b888864042d0fafa80597e63a5c2288d5"} Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.833520 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b95b-account-create-jlqb5" event={"ID":"92bcb867-7f39-415a-8565-027fa8d2963e","Type":"ContainerDied","Data":"f75e687bb405f5147cd5023dccca9c5c52269045a8770336b3e4b4763a968976"} Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.833555 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b95b-account-create-jlqb5" Oct 02 11:13:44 crc kubenswrapper[4835]: I1002 11:13:44.833568 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f75e687bb405f5147cd5023dccca9c5c52269045a8770336b3e4b4763a968976" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.223341 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.305426 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-combined-ca-bundle\") pod \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.333836 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" (UID: "2c1e8272-6be9-43d5-99e6-571b3d2a5ba1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.408422 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lfld\" (UniqueName: \"kubernetes.io/projected/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-kube-api-access-4lfld\") pod \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.408514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-config-data\") pod \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\" (UID: \"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1\") " Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.409165 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.412511 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-kube-api-access-4lfld" (OuterVolumeSpecName: "kube-api-access-4lfld") pod "2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" (UID: "2c1e8272-6be9-43d5-99e6-571b3d2a5ba1"). InnerVolumeSpecName "kube-api-access-4lfld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.458267 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-config-data" (OuterVolumeSpecName: "config-data") pod "2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" (UID: "2c1e8272-6be9-43d5-99e6-571b3d2a5ba1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.511836 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lfld\" (UniqueName: \"kubernetes.io/projected/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-kube-api-access-4lfld\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.511883 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.851907 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hbr8l" event={"ID":"2c1e8272-6be9-43d5-99e6-571b3d2a5ba1","Type":"ContainerDied","Data":"f60890608f26ade5a882fe896847715c5b62fb431622dde50b415b56cae1e032"} Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.851938 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hbr8l" Oct 02 11:13:45 crc kubenswrapper[4835]: I1002 11:13:45.851968 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60890608f26ade5a882fe896847715c5b62fb431622dde50b415b56cae1e032" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.155869 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4s5m7"] Oct 02 11:13:46 crc kubenswrapper[4835]: E1002 11:13:46.156507 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" containerName="keystone-db-sync" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.156533 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" containerName="keystone-db-sync" Oct 02 11:13:46 crc kubenswrapper[4835]: E1002 11:13:46.156576 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bcb867-7f39-415a-8565-027fa8d2963e" containerName="mariadb-account-create" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.156589 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bcb867-7f39-415a-8565-027fa8d2963e" containerName="mariadb-account-create" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.156812 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" containerName="keystone-db-sync" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.156839 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bcb867-7f39-415a-8565-027fa8d2963e" containerName="mariadb-account-create" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.157710 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.164756 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.164966 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.165516 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5nt8" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.168373 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.170277 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-nnmhb"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.172243 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.186700 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4s5m7"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.202320 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-nnmhb"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.225455 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.225902 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.225932 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-credential-keys\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.225956 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.226021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-config\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.226074 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-fernet-keys\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.226109 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2gb\" (UniqueName: \"kubernetes.io/projected/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-kube-api-access-mt2gb\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.226254 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkhsx\" (UniqueName: \"kubernetes.io/projected/7b56f7fb-c3a1-4e13-af09-b218aab27191-kube-api-access-gkhsx\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.226364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-combined-ca-bundle\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.226412 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-config-data\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.226528 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-scripts\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328038 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328165 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-credential-keys\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328186 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328234 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-config\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328274 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-fernet-keys\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328303 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2gb\" (UniqueName: \"kubernetes.io/projected/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-kube-api-access-mt2gb\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkhsx\" (UniqueName: \"kubernetes.io/projected/7b56f7fb-c3a1-4e13-af09-b218aab27191-kube-api-access-gkhsx\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328360 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-combined-ca-bundle\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328391 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-config-data\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.328434 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-scripts\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.330260 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-config\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.332246 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-sb\") pod 
\"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.332925 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.337994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.344124 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-scripts\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.344590 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-credential-keys\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.345526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-config-data\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.346182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-fernet-keys\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.349976 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-combined-ca-bundle\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.370856 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkhsx\" (UniqueName: \"kubernetes.io/projected/7b56f7fb-c3a1-4e13-af09-b218aab27191-kube-api-access-gkhsx\") pod \"dnsmasq-dns-75bb4695fc-nnmhb\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.379874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2gb\" (UniqueName: \"kubernetes.io/projected/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-kube-api-access-mt2gb\") pod \"keystone-bootstrap-4s5m7\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.469623 
4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.489089 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2vdk6"] Oct 02 11:13:46 crc kubenswrapper[4835]: E1002 11:13:46.490377 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9246fcb-4999-4c04-8d46-729b57a896ef" containerName="glance-db-sync" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.490428 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9246fcb-4999-4c04-8d46-729b57a896ef" containerName="glance-db-sync" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.491033 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9246fcb-4999-4c04-8d46-729b57a896ef" containerName="glance-db-sync" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.492112 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.502195 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.502197 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.502309 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lmqcj" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.536421 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2vdk6"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.547294 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.547859 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-combined-ca-bundle\") pod \"d9246fcb-4999-4c04-8d46-729b57a896ef\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.548017 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-config-data\") pod \"d9246fcb-4999-4c04-8d46-729b57a896ef\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.548129 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-db-sync-config-data\") pod \"d9246fcb-4999-4c04-8d46-729b57a896ef\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.548196 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2wvl\" (UniqueName: \"kubernetes.io/projected/d9246fcb-4999-4c04-8d46-729b57a896ef-kube-api-access-k2wvl\") pod \"d9246fcb-4999-4c04-8d46-729b57a896ef\" (UID: \"d9246fcb-4999-4c04-8d46-729b57a896ef\") " Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.554802 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jwqkm"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.555972 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.558180 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.561643 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8vv6j" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.561864 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.562075 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.566527 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9246fcb-4999-4c04-8d46-729b57a896ef-kube-api-access-k2wvl" (OuterVolumeSpecName: "kube-api-access-k2wvl") pod "d9246fcb-4999-4c04-8d46-729b57a896ef" (UID: "d9246fcb-4999-4c04-8d46-729b57a896ef"). InnerVolumeSpecName "kube-api-access-k2wvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.579009 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d9246fcb-4999-4c04-8d46-729b57a896ef" (UID: "d9246fcb-4999-4c04-8d46-729b57a896ef"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.585298 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jwqkm"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.599455 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-nnmhb"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.617049 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xqhpc"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.619254 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.620298 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9246fcb-4999-4c04-8d46-729b57a896ef" (UID: "d9246fcb-4999-4c04-8d46-729b57a896ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.635659 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xqhpc"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.649860 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czplq\" (UniqueName: \"kubernetes.io/projected/01665ace-3f34-4029-a202-6f350e8497f9-kube-api-access-czplq\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.649952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svdb\" (UniqueName: \"kubernetes.io/projected/fc534d7c-ef08-44e5-b56d-d3421477c51d-kube-api-access-4svdb\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.649981 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-db-sync-config-data\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650017 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-combined-ca-bundle\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650040 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-combined-ca-bundle\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650072 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-scripts\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650103 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-config-data\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650558 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-config-data\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650659 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc534d7c-ef08-44e5-b56d-d3421477c51d-etc-machine-id\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650731 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-scripts\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650777 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01665ace-3f34-4029-a202-6f350e8497f9-logs\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650933 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650954 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2wvl\" (UniqueName: \"kubernetes.io/projected/d9246fcb-4999-4c04-8d46-729b57a896ef-kube-api-access-k2wvl\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.650967 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.653498 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.657083 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.661810 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.662908 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.684452 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-config-data" (OuterVolumeSpecName: "config-data") pod "d9246fcb-4999-4c04-8d46-729b57a896ef" (UID: "d9246fcb-4999-4c04-8d46-729b57a896ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.692415 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753342 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-log-httpd\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753410 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-config\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753452 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svdb\" (UniqueName: \"kubernetes.io/projected/fc534d7c-ef08-44e5-b56d-d3421477c51d-kube-api-access-4svdb\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753484 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753506 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-db-sync-config-data\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753527 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-sb\") pod 
\"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753586 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-combined-ca-bundle\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753612 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-combined-ca-bundle\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753642 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcms\" (UniqueName: \"kubernetes.io/projected/61fa3266-8b7b-45a8-a25f-e400fe1252f1-kube-api-access-2fcms\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753674 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-config-data\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753723 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-scripts\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753754 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-scripts\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753798 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-config-data\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753853 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753878 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-config-data\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 
11:13:46.753901 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc534d7c-ef08-44e5-b56d-d3421477c51d-etc-machine-id\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753925 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-scripts\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753947 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01665ace-3f34-4029-a202-6f350e8497f9-logs\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.753990 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-run-httpd\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.758510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br785\" (UniqueName: \"kubernetes.io/projected/128cb0eb-73e9-4945-ab6e-212f20b7306a-kube-api-access-br785\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.758593 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.758641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czplq\" (UniqueName: \"kubernetes.io/projected/01665ace-3f34-4029-a202-6f350e8497f9-kube-api-access-czplq\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.758866 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9246fcb-4999-4c04-8d46-729b57a896ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.763687 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-db-sync-config-data\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.763933 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01665ace-3f34-4029-a202-6f350e8497f9-logs\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " 
pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.765347 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-config-data\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.765508 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc534d7c-ef08-44e5-b56d-d3421477c51d-etc-machine-id\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.772907 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-scripts\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.773550 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-combined-ca-bundle\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.774755 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-config-data\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.778832 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-scripts\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.780806 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4svdb\" (UniqueName: \"kubernetes.io/projected/fc534d7c-ef08-44e5-b56d-d3421477c51d-kube-api-access-4svdb\") pod \"cinder-db-sync-2vdk6\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.784245 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-combined-ca-bundle\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.784498 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czplq\" (UniqueName: \"kubernetes.io/projected/01665ace-3f34-4029-a202-6f350e8497f9-kube-api-access-czplq\") pod \"placement-db-sync-jwqkm\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.848394 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.860833 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-log-httpd\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.860887 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-config\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.860924 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.860942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.860967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.861002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcms\" (UniqueName: \"kubernetes.io/projected/61fa3266-8b7b-45a8-a25f-e400fe1252f1-kube-api-access-2fcms\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.861018 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-config-data\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.861045 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-scripts\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.861095 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.861148 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-run-httpd\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.861176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br785\" (UniqueName: \"kubernetes.io/projected/128cb0eb-73e9-4945-ab6e-212f20b7306a-kube-api-access-br785\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.861200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.866973 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.866969 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-run-httpd\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.867361 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-log-httpd\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.868108 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-config\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.870541 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.863555 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.871773 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.875571 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.879756 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-scripts\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.883919 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7zkdc" event={"ID":"d9246fcb-4999-4c04-8d46-729b57a896ef","Type":"ContainerDied","Data":"a244507335f3b7d12ee58576c98f84f3061f6ee5222dc84e8aa5cb5e3dbf9598"} Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.883983 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a244507335f3b7d12ee58576c98f84f3061f6ee5222dc84e8aa5cb5e3dbf9598" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.884059 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7zkdc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.884517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-config-data\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.889135 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcms\" (UniqueName: \"kubernetes.io/projected/61fa3266-8b7b-45a8-a25f-e400fe1252f1-kube-api-access-2fcms\") pod \"ceilometer-0\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " pod="openstack/ceilometer-0" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.897439 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br785\" (UniqueName: \"kubernetes.io/projected/128cb0eb-73e9-4945-ab6e-212f20b7306a-kube-api-access-br785\") pod \"dnsmasq-dns-745b9ddc8c-xqhpc\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:46 crc kubenswrapper[4835]: I1002 11:13:46.971508 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jwqkm" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.007748 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.016282 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.268659 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-nnmhb"] Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.319408 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xqhpc"] Oct 02 11:13:47 crc kubenswrapper[4835]: W1002 11:13:47.337159 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b56f7fb_c3a1_4e13_af09_b218aab27191.slice/crio-3315f1e065ca04456a9d457027316a1affb29d12a2c21cdcc15574491a74d044 WatchSource:0}: Error finding container 3315f1e065ca04456a9d457027316a1affb29d12a2c21cdcc15574491a74d044: Status 404 returned error can't find the container with id 3315f1e065ca04456a9d457027316a1affb29d12a2c21cdcc15574491a74d044 Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.355792 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4s5m7"] Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.365553 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-d4ngg"] Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.372521 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.402993 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-d4ngg"] Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.484297 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.484643 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgbgs\" (UniqueName: \"kubernetes.io/projected/d8a71c32-05c6-4635-9e6a-3d72d59edd72-kube-api-access-dgbgs\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.484701 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.484737 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-config\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.484788 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: 
\"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.514976 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2vdk6"] Oct 02 11:13:47 crc kubenswrapper[4835]: W1002 11:13:47.542567 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc534d7c_ef08_44e5_b56d_d3421477c51d.slice/crio-1ce615fb1caaf17c8de25aee44904ab29fb81696cd112509a16666576484a582 WatchSource:0}: Error finding container 1ce615fb1caaf17c8de25aee44904ab29fb81696cd112509a16666576484a582: Status 404 returned error can't find the container with id 1ce615fb1caaf17c8de25aee44904ab29fb81696cd112509a16666576484a582 Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.588030 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgbgs\" (UniqueName: \"kubernetes.io/projected/d8a71c32-05c6-4635-9e6a-3d72d59edd72-kube-api-access-dgbgs\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.588103 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.588137 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-config\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.588178 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.588264 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.589170 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.590009 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.598103 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.599047 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-config\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.639282 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgbgs\" (UniqueName: \"kubernetes.io/projected/d8a71c32-05c6-4635-9e6a-3d72d59edd72-kube-api-access-dgbgs\") pod \"dnsmasq-dns-7987f74bbc-d4ngg\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.789038 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.919213 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2vdk6" event={"ID":"fc534d7c-ef08-44e5-b56d-d3421477c51d","Type":"ContainerStarted","Data":"1ce615fb1caaf17c8de25aee44904ab29fb81696cd112509a16666576484a582"} Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.931681 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" event={"ID":"7b56f7fb-c3a1-4e13-af09-b218aab27191","Type":"ContainerStarted","Data":"3315f1e065ca04456a9d457027316a1affb29d12a2c21cdcc15574491a74d044"} Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.934573 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4s5m7" event={"ID":"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e","Type":"ContainerStarted","Data":"78906bfb21ff2ad9bf11a249f1055013fb0b8cdc3afb343be8d537a89c9cc7ff"} Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.934599 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4s5m7" event={"ID":"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e","Type":"ContainerStarted","Data":"7515a1f07c43fbc8e3828b0aba056563d9af1a1dbc105768b0cbca0114028de5"} Oct 02 11:13:47 crc kubenswrapper[4835]: I1002 11:13:47.966752 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4s5m7" podStartSLOduration=1.966681847 podStartE2EDuration="1.966681847s" podCreationTimestamp="2025-10-02 11:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:47.959501061 +0000 UTC m=+1104.519408652" watchObservedRunningTime="2025-10-02 11:13:47.966681847 +0000 UTC m=+1104.526589428" Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.008156 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.043925 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xqhpc"] Oct 02 11:13:48 crc kubenswrapper[4835]: W1002 11:13:48.062675 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01665ace_3f34_4029_a202_6f350e8497f9.slice/crio-61f7c0f8e2d4024320d6ec93e9e1fecd7b158472a8e9477be491493b32f82854 WatchSource:0}: Error finding container 61f7c0f8e2d4024320d6ec93e9e1fecd7b158472a8e9477be491493b32f82854: Status 404 returned error can't find the container with id 61f7c0f8e2d4024320d6ec93e9e1fecd7b158472a8e9477be491493b32f82854 Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.063876 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jwqkm"] Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.551721 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-d4ngg"] Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.696092 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.958088 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jwqkm" event={"ID":"01665ace-3f34-4029-a202-6f350e8497f9","Type":"ContainerStarted","Data":"61f7c0f8e2d4024320d6ec93e9e1fecd7b158472a8e9477be491493b32f82854"} Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.965373 4835 generic.go:334] "Generic (PLEG): container finished" podID="128cb0eb-73e9-4945-ab6e-212f20b7306a" containerID="c0801be99f9c1f90d3bbe8426cbe287a92d817b17a8982e2f055c3607376b786" exitCode=0 Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.965471 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" event={"ID":"128cb0eb-73e9-4945-ab6e-212f20b7306a","Type":"ContainerDied","Data":"c0801be99f9c1f90d3bbe8426cbe287a92d817b17a8982e2f055c3607376b786"} Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.965510 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" event={"ID":"128cb0eb-73e9-4945-ab6e-212f20b7306a","Type":"ContainerStarted","Data":"87b21d581f00d851a3eab6507e29ddc1128d2899368507bab349e4347fae852e"} Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.969122 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" event={"ID":"d8a71c32-05c6-4635-9e6a-3d72d59edd72","Type":"ContainerStarted","Data":"1ce99fd53c32b01e37ee8998f4817e43e69b4c90bd979e6323ac6ce1f20cdec4"} Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.974749 4835 generic.go:334] "Generic (PLEG): container finished" podID="7b56f7fb-c3a1-4e13-af09-b218aab27191" containerID="afa61c9eab459b8ef8124fa5f0270cff88daedb2e3014d1775534c418c90958a" exitCode=0 Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.974826 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" event={"ID":"7b56f7fb-c3a1-4e13-af09-b218aab27191","Type":"ContainerDied","Data":"afa61c9eab459b8ef8124fa5f0270cff88daedb2e3014d1775534c418c90958a"} Oct 02 11:13:48 crc kubenswrapper[4835]: I1002 11:13:48.979673 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61fa3266-8b7b-45a8-a25f-e400fe1252f1","Type":"ContainerStarted","Data":"b3bb92927e413932e542ec7378c79783106619e77c7f99d73a30c9c02e61b516"} Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.470307 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.482363 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.566633 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-dns-svc\") pod \"128cb0eb-73e9-4945-ab6e-212f20b7306a\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.566872 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-sb\") pod \"128cb0eb-73e9-4945-ab6e-212f20b7306a\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.566945 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br785\" (UniqueName: \"kubernetes.io/projected/128cb0eb-73e9-4945-ab6e-212f20b7306a-kube-api-access-br785\") pod \"128cb0eb-73e9-4945-ab6e-212f20b7306a\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.567019 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-nb\") pod \"128cb0eb-73e9-4945-ab6e-212f20b7306a\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.567060 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-config\") pod \"128cb0eb-73e9-4945-ab6e-212f20b7306a\" (UID: \"128cb0eb-73e9-4945-ab6e-212f20b7306a\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.573939 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128cb0eb-73e9-4945-ab6e-212f20b7306a-kube-api-access-br785" (OuterVolumeSpecName: "kube-api-access-br785") pod "128cb0eb-73e9-4945-ab6e-212f20b7306a" (UID: "128cb0eb-73e9-4945-ab6e-212f20b7306a"). InnerVolumeSpecName "kube-api-access-br785". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.602003 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "128cb0eb-73e9-4945-ab6e-212f20b7306a" (UID: "128cb0eb-73e9-4945-ab6e-212f20b7306a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.603415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-config" (OuterVolumeSpecName: "config") pod "128cb0eb-73e9-4945-ab6e-212f20b7306a" (UID: "128cb0eb-73e9-4945-ab6e-212f20b7306a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.603563 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "128cb0eb-73e9-4945-ab6e-212f20b7306a" (UID: "128cb0eb-73e9-4945-ab6e-212f20b7306a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.604044 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "128cb0eb-73e9-4945-ab6e-212f20b7306a" (UID: "128cb0eb-73e9-4945-ab6e-212f20b7306a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.668627 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-config\") pod \"7b56f7fb-c3a1-4e13-af09-b218aab27191\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.668766 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkhsx\" (UniqueName: \"kubernetes.io/projected/7b56f7fb-c3a1-4e13-af09-b218aab27191-kube-api-access-gkhsx\") pod \"7b56f7fb-c3a1-4e13-af09-b218aab27191\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.668795 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-sb\") pod \"7b56f7fb-c3a1-4e13-af09-b218aab27191\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.668881 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-nb\") pod \"7b56f7fb-c3a1-4e13-af09-b218aab27191\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.668923 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-dns-svc\") pod \"7b56f7fb-c3a1-4e13-af09-b218aab27191\" (UID: \"7b56f7fb-c3a1-4e13-af09-b218aab27191\") " Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.669652 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.669674 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.669686 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br785\" (UniqueName: \"kubernetes.io/projected/128cb0eb-73e9-4945-ab6e-212f20b7306a-kube-api-access-br785\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.669695 4835 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.669704 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/128cb0eb-73e9-4945-ab6e-212f20b7306a-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.676250 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b56f7fb-c3a1-4e13-af09-b218aab27191-kube-api-access-gkhsx" (OuterVolumeSpecName: "kube-api-access-gkhsx") pod "7b56f7fb-c3a1-4e13-af09-b218aab27191" (UID: "7b56f7fb-c3a1-4e13-af09-b218aab27191"). InnerVolumeSpecName "kube-api-access-gkhsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.701040 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b56f7fb-c3a1-4e13-af09-b218aab27191" (UID: "7b56f7fb-c3a1-4e13-af09-b218aab27191"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.701116 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-config" (OuterVolumeSpecName: "config") pod "7b56f7fb-c3a1-4e13-af09-b218aab27191" (UID: "7b56f7fb-c3a1-4e13-af09-b218aab27191"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.701623 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b56f7fb-c3a1-4e13-af09-b218aab27191" (UID: "7b56f7fb-c3a1-4e13-af09-b218aab27191"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.708446 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b56f7fb-c3a1-4e13-af09-b218aab27191" (UID: "7b56f7fb-c3a1-4e13-af09-b218aab27191"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.771913 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkhsx\" (UniqueName: \"kubernetes.io/projected/7b56f7fb-c3a1-4e13-af09-b218aab27191-kube-api-access-gkhsx\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.771963 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.771975 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.771985 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:49 crc kubenswrapper[4835]: I1002 11:13:49.771995 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b56f7fb-c3a1-4e13-af09-b218aab27191-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.011732 4835 generic.go:334] "Generic (PLEG): container finished" podID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerID="935cd6c7e6e10a88cf01b25936f0cd81cc4a73e79043dc360e73b31005df8822" exitCode=0 Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.011803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" event={"ID":"d8a71c32-05c6-4635-9e6a-3d72d59edd72","Type":"ContainerDied","Data":"935cd6c7e6e10a88cf01b25936f0cd81cc4a73e79043dc360e73b31005df8822"} Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.017409 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" event={"ID":"7b56f7fb-c3a1-4e13-af09-b218aab27191","Type":"ContainerDied","Data":"3315f1e065ca04456a9d457027316a1affb29d12a2c21cdcc15574491a74d044"} Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.017564 4835 scope.go:117] "RemoveContainer" containerID="afa61c9eab459b8ef8124fa5f0270cff88daedb2e3014d1775534c418c90958a" Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.017523 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-nnmhb" Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.023778 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" event={"ID":"128cb0eb-73e9-4945-ab6e-212f20b7306a","Type":"ContainerDied","Data":"87b21d581f00d851a3eab6507e29ddc1128d2899368507bab349e4347fae852e"} Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.023870 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xqhpc" Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.106861 4835 scope.go:117] "RemoveContainer" containerID="c0801be99f9c1f90d3bbe8426cbe287a92d817b17a8982e2f055c3607376b786" Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.133107 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-nnmhb"] Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.176055 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-nnmhb"] Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.227348 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xqhpc"] Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.246714 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xqhpc"] Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.321936 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128cb0eb-73e9-4945-ab6e-212f20b7306a" path="/var/lib/kubelet/pods/128cb0eb-73e9-4945-ab6e-212f20b7306a/volumes" Oct 02 11:13:50 crc kubenswrapper[4835]: I1002 11:13:50.322787 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b56f7fb-c3a1-4e13-af09-b218aab27191" path="/var/lib/kubelet/pods/7b56f7fb-c3a1-4e13-af09-b218aab27191/volumes" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.064383 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" event={"ID":"d8a71c32-05c6-4635-9e6a-3d72d59edd72","Type":"ContainerStarted","Data":"9c33867b6750f2b2be57b205a9e5bb3c1480513fa54d9f30c60fe290d1b06144"} Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.065312 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.089173 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" podStartSLOduration=4.08914779 podStartE2EDuration="4.08914779s" podCreationTimestamp="2025-10-02 11:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:51.087820322 +0000 UTC m=+1107.647727943" watchObservedRunningTime="2025-10-02 11:13:51.08914779 +0000 UTC m=+1107.649055371" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.177352 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6f51-account-create-7wtfl"] Oct 02 11:13:51 crc kubenswrapper[4835]: E1002 11:13:51.177720 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128cb0eb-73e9-4945-ab6e-212f20b7306a" containerName="init" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.177734 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="128cb0eb-73e9-4945-ab6e-212f20b7306a" containerName="init" Oct 02 11:13:51 crc kubenswrapper[4835]: E1002 11:13:51.177747 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b56f7fb-c3a1-4e13-af09-b218aab27191" containerName="init" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.177754 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b56f7fb-c3a1-4e13-af09-b218aab27191" containerName="init" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.178064 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="128cb0eb-73e9-4945-ab6e-212f20b7306a" 
containerName="init" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.178095 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b56f7fb-c3a1-4e13-af09-b218aab27191" containerName="init" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.178739 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6f51-account-create-7wtfl" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.185680 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.188214 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6f51-account-create-7wtfl"] Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.306727 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmdd\" (UniqueName: \"kubernetes.io/projected/53018b21-5abb-49d3-9098-041193565c81-kube-api-access-7lmdd\") pod \"barbican-6f51-account-create-7wtfl\" (UID: \"53018b21-5abb-49d3-9098-041193565c81\") " pod="openstack/barbican-6f51-account-create-7wtfl" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.374409 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-73f9-account-create-kgdkc"] Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.375494 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73f9-account-create-kgdkc" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.378623 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.398880 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73f9-account-create-kgdkc"] Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.408029 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmdd\" (UniqueName: \"kubernetes.io/projected/53018b21-5abb-49d3-9098-041193565c81-kube-api-access-7lmdd\") pod \"barbican-6f51-account-create-7wtfl\" (UID: \"53018b21-5abb-49d3-9098-041193565c81\") " pod="openstack/barbican-6f51-account-create-7wtfl" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.436846 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmdd\" (UniqueName: \"kubernetes.io/projected/53018b21-5abb-49d3-9098-041193565c81-kube-api-access-7lmdd\") pod \"barbican-6f51-account-create-7wtfl\" (UID: \"53018b21-5abb-49d3-9098-041193565c81\") " pod="openstack/barbican-6f51-account-create-7wtfl" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.507656 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6f51-account-create-7wtfl" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.509322 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g2kr\" (UniqueName: \"kubernetes.io/projected/492bc087-f2ed-4b40-8fe0-74accde085ce-kube-api-access-7g2kr\") pod \"neutron-73f9-account-create-kgdkc\" (UID: \"492bc087-f2ed-4b40-8fe0-74accde085ce\") " pod="openstack/neutron-73f9-account-create-kgdkc" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.610822 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g2kr\" (UniqueName: \"kubernetes.io/projected/492bc087-f2ed-4b40-8fe0-74accde085ce-kube-api-access-7g2kr\") pod \"neutron-73f9-account-create-kgdkc\" (UID: \"492bc087-f2ed-4b40-8fe0-74accde085ce\") " pod="openstack/neutron-73f9-account-create-kgdkc" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.629879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g2kr\" (UniqueName: \"kubernetes.io/projected/492bc087-f2ed-4b40-8fe0-74accde085ce-kube-api-access-7g2kr\") pod \"neutron-73f9-account-create-kgdkc\" (UID: \"492bc087-f2ed-4b40-8fe0-74accde085ce\") " pod="openstack/neutron-73f9-account-create-kgdkc" Oct 02 11:13:51 crc kubenswrapper[4835]: I1002 11:13:51.695435 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73f9-account-create-kgdkc" Oct 02 11:13:52 crc kubenswrapper[4835]: I1002 11:13:52.081281 4835 generic.go:334] "Generic (PLEG): container finished" podID="87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" containerID="78906bfb21ff2ad9bf11a249f1055013fb0b8cdc3afb343be8d537a89c9cc7ff" exitCode=0 Oct 02 11:13:52 crc kubenswrapper[4835]: I1002 11:13:52.081342 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4s5m7" event={"ID":"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e","Type":"ContainerDied","Data":"78906bfb21ff2ad9bf11a249f1055013fb0b8cdc3afb343be8d537a89c9cc7ff"} Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.516132 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.627202 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-combined-ca-bundle\") pod \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.627710 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-credential-keys\") pod \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.627888 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-config-data\") pod \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.628038 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-scripts\") pod \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.628239 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-fernet-keys\") pod \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.628399 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt2gb\" (UniqueName: \"kubernetes.io/projected/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-kube-api-access-mt2gb\") pod \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\" (UID: \"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e\") " Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.633734 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" (UID: "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.633752 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-kube-api-access-mt2gb" (OuterVolumeSpecName: "kube-api-access-mt2gb") pod "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" (UID: "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e"). InnerVolumeSpecName "kube-api-access-mt2gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.635979 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" (UID: "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.635933 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-scripts" (OuterVolumeSpecName: "scripts") pod "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" (UID: "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.660933 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" (UID: "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.680340 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-config-data" (OuterVolumeSpecName: "config-data") pod "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" (UID: "87f0b083-e7cc-42c4-91f6-0d0fe4482a1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.730721 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt2gb\" (UniqueName: \"kubernetes.io/projected/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-kube-api-access-mt2gb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.730775 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.730792 4835 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.730805 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.730819 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:56 crc kubenswrapper[4835]: I1002 11:13:56.730831 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.147619 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4s5m7" event={"ID":"87f0b083-e7cc-42c4-91f6-0d0fe4482a1e","Type":"ContainerDied","Data":"7515a1f07c43fbc8e3828b0aba056563d9af1a1dbc105768b0cbca0114028de5"} Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.147676 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7515a1f07c43fbc8e3828b0aba056563d9af1a1dbc105768b0cbca0114028de5" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.147704 4835 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4s5m7" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.637838 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4s5m7"] Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.644471 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4s5m7"] Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.732876 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vg55h"] Oct 02 11:13:57 crc kubenswrapper[4835]: E1002 11:13:57.734123 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" containerName="keystone-bootstrap" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.734151 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" containerName="keystone-bootstrap" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.734381 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" containerName="keystone-bootstrap" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.735114 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.737505 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.739715 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5nt8" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.739909 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.740128 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.754685 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vg55h"] Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.791058 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.851619 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-cjkjv"] Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.851874 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" containerID="cri-o://e9ba4d66278959154e7f401fd433752b90913cc2898571ab1ad680089f52f1cb" gracePeriod=10 Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.855294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-fernet-keys\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.855345 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-credential-keys\") pod 
\"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.855436 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-config-data\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.855453 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-combined-ca-bundle\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.855471 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-scripts\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.855509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knxs5\" (UniqueName: \"kubernetes.io/projected/152bf752-1382-4613-84cb-a392f0666d6c-kube-api-access-knxs5\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.957193 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-credential-keys\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.957456 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-config-data\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.957480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-combined-ca-bundle\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.957507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-scripts\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.957572 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knxs5\" (UniqueName: \"kubernetes.io/projected/152bf752-1382-4613-84cb-a392f0666d6c-kube-api-access-knxs5\") pod \"keystone-bootstrap-vg55h\" (UID: 
\"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.957632 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-fernet-keys\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.963712 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-credential-keys\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.964366 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-scripts\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.964886 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-combined-ca-bundle\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.972546 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-config-data\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.987174 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knxs5\" (UniqueName: \"kubernetes.io/projected/152bf752-1382-4613-84cb-a392f0666d6c-kube-api-access-knxs5\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:57 crc kubenswrapper[4835]: I1002 11:13:57.994909 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-fernet-keys\") pod \"keystone-bootstrap-vg55h\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:58 crc kubenswrapper[4835]: I1002 11:13:58.105635 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:13:58 crc kubenswrapper[4835]: I1002 11:13:58.263986 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f0b083-e7cc-42c4-91f6-0d0fe4482a1e" path="/var/lib/kubelet/pods/87f0b083-e7cc-42c4-91f6-0d0fe4482a1e/volumes" Oct 02 11:13:58 crc kubenswrapper[4835]: I1002 11:13:58.314655 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Oct 02 11:13:59 crc kubenswrapper[4835]: I1002 11:13:59.202655 4835 generic.go:334] "Generic (PLEG): container finished" podID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerID="e9ba4d66278959154e7f401fd433752b90913cc2898571ab1ad680089f52f1cb" exitCode=0 Oct 02 11:13:59 crc kubenswrapper[4835]: I1002 11:13:59.202722 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" event={"ID":"16ab9964-fa74-4216-a34d-37c0fc813a16","Type":"ContainerDied","Data":"e9ba4d66278959154e7f401fd433752b90913cc2898571ab1ad680089f52f1cb"} Oct 02 11:14:03 crc kubenswrapper[4835]: I1002 11:14:03.312108 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Oct 02 11:14:08 crc kubenswrapper[4835]: I1002 11:14:08.312032 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Oct 02 11:14:08 crc kubenswrapper[4835]: I1002 11:14:08.312602 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:14:13 crc kubenswrapper[4835]: I1002 11:14:13.312601 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Oct 02 11:14:17 crc kubenswrapper[4835]: E1002 11:14:17.541834 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 02 11:14:17 crc kubenswrapper[4835]: E1002 11:14:17.542726 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4svdb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2vdk6_openstack(fc534d7c-ef08-44e5-b56d-d3421477c51d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:14:17 crc kubenswrapper[4835]: E1002 11:14:17.544268 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2vdk6" podUID="fc534d7c-ef08-44e5-b56d-d3421477c51d" Oct 02 11:14:17 crc kubenswrapper[4835]: I1002 11:14:17.954118 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.033085 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vg55h"] Oct 02 11:14:18 crc kubenswrapper[4835]: W1002 11:14:18.042648 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod152bf752_1382_4613_84cb_a392f0666d6c.slice/crio-e6f1cd73e6a5e6920c16f9f71dd411d5009f5bc358b4335d262790b910d0dea2 WatchSource:0}: Error finding container e6f1cd73e6a5e6920c16f9f71dd411d5009f5bc358b4335d262790b910d0dea2: Status 404 returned error can't find the container with id e6f1cd73e6a5e6920c16f9f71dd411d5009f5bc358b4335d262790b910d0dea2 Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.076818 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-nb\") pod \"16ab9964-fa74-4216-a34d-37c0fc813a16\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.076867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-dns-svc\") pod \"16ab9964-fa74-4216-a34d-37c0fc813a16\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.076952 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-sb\") pod \"16ab9964-fa74-4216-a34d-37c0fc813a16\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.076974 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-config\") pod \"16ab9964-fa74-4216-a34d-37c0fc813a16\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.077079 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz5pn\" (UniqueName: \"kubernetes.io/projected/16ab9964-fa74-4216-a34d-37c0fc813a16-kube-api-access-vz5pn\") pod \"16ab9964-fa74-4216-a34d-37c0fc813a16\" (UID: \"16ab9964-fa74-4216-a34d-37c0fc813a16\") " Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.081928 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ab9964-fa74-4216-a34d-37c0fc813a16-kube-api-access-vz5pn" (OuterVolumeSpecName: "kube-api-access-vz5pn") pod "16ab9964-fa74-4216-a34d-37c0fc813a16" (UID: "16ab9964-fa74-4216-a34d-37c0fc813a16"). InnerVolumeSpecName "kube-api-access-vz5pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.126900 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "16ab9964-fa74-4216-a34d-37c0fc813a16" (UID: "16ab9964-fa74-4216-a34d-37c0fc813a16"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.135081 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73f9-account-create-kgdkc"] Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.141878 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6f51-account-create-7wtfl"] Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.153823 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "16ab9964-fa74-4216-a34d-37c0fc813a16" (UID: "16ab9964-fa74-4216-a34d-37c0fc813a16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.154471 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-config" (OuterVolumeSpecName: "config") pod "16ab9964-fa74-4216-a34d-37c0fc813a16" (UID: "16ab9964-fa74-4216-a34d-37c0fc813a16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.167215 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16ab9964-fa74-4216-a34d-37c0fc813a16" (UID: "16ab9964-fa74-4216-a34d-37c0fc813a16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.179003 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.179299 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.179400 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.179481 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ab9964-fa74-4216-a34d-37c0fc813a16-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.179560 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz5pn\" (UniqueName: \"kubernetes.io/projected/16ab9964-fa74-4216-a34d-37c0fc813a16-kube-api-access-vz5pn\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.381358 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vg55h" event={"ID":"152bf752-1382-4613-84cb-a392f0666d6c","Type":"ContainerStarted","Data":"584b421f8a9321a47c9d94629bf8e2bd75fa8175eda32b2d55c3384b1fdf6f90"} Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.381700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vg55h" 
event={"ID":"152bf752-1382-4613-84cb-a392f0666d6c","Type":"ContainerStarted","Data":"e6f1cd73e6a5e6920c16f9f71dd411d5009f5bc358b4335d262790b910d0dea2"} Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.384082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f51-account-create-7wtfl" event={"ID":"53018b21-5abb-49d3-9098-041193565c81","Type":"ContainerStarted","Data":"aeec7d539aee1c9aca026dcc0da91002669bb04ab1b72ce2177752885d3e5f88"} Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.384136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f51-account-create-7wtfl" event={"ID":"53018b21-5abb-49d3-9098-041193565c81","Type":"ContainerStarted","Data":"7a775bac00158852a3a37e48e1a2ba3c3c2aceb62494c844acc08c11e81a44cd"} Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.385645 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73f9-account-create-kgdkc" event={"ID":"492bc087-f2ed-4b40-8fe0-74accde085ce","Type":"ContainerStarted","Data":"0f1c4b5646ebc765b096f7a110db6b3d01992500b8df3696ba7e483900498bca"} Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.387264 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jwqkm" event={"ID":"01665ace-3f34-4029-a202-6f350e8497f9","Type":"ContainerStarted","Data":"cb8cf6922fd514e24e9fa8c0746f61209d84a55c0f311ae3fe3fd776f075bb5d"} Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.389652 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.389682 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-cjkjv" event={"ID":"16ab9964-fa74-4216-a34d-37c0fc813a16","Type":"ContainerDied","Data":"f36c3b7e302c054739fa86c57a0b1befc56c56e0e83669eeee58462a94669e1f"} Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.389774 4835 scope.go:117] "RemoveContainer" containerID="e9ba4d66278959154e7f401fd433752b90913cc2898571ab1ad680089f52f1cb" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.392030 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61fa3266-8b7b-45a8-a25f-e400fe1252f1","Type":"ContainerStarted","Data":"e30a4c7e371b0eeceea0598c139b528515c98bff1cbbf92a725014b76bc30a46"} Oct 02 11:14:18 crc kubenswrapper[4835]: E1002 11:14:18.393475 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-2vdk6" podUID="fc534d7c-ef08-44e5-b56d-d3421477c51d" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.414119 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jwqkm" podStartSLOduration=2.928054216 podStartE2EDuration="32.41409423s" podCreationTimestamp="2025-10-02 11:13:46 +0000 UTC" firstStartedPulling="2025-10-02 11:13:48.070447063 +0000 UTC m=+1104.630354644" lastFinishedPulling="2025-10-02 11:14:17.556487057 +0000 UTC m=+1134.116394658" observedRunningTime="2025-10-02 11:14:18.407422889 +0000 UTC m=+1134.967330500" watchObservedRunningTime="2025-10-02 11:14:18.41409423 +0000 UTC m=+1134.974001821" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.421961 4835 scope.go:117] "RemoveContainer" 
containerID="218ce625bcfca3e1dd61767139013b197951c21ba14f39c67e1cb361576de52f" Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.437793 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-cjkjv"] Oct 02 11:14:18 crc kubenswrapper[4835]: I1002 11:14:18.444643 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-cjkjv"] Oct 02 11:14:19 crc kubenswrapper[4835]: I1002 11:14:19.402294 4835 generic.go:334] "Generic (PLEG): container finished" podID="53018b21-5abb-49d3-9098-041193565c81" containerID="aeec7d539aee1c9aca026dcc0da91002669bb04ab1b72ce2177752885d3e5f88" exitCode=0 Oct 02 11:14:19 crc kubenswrapper[4835]: I1002 11:14:19.402354 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f51-account-create-7wtfl" event={"ID":"53018b21-5abb-49d3-9098-041193565c81","Type":"ContainerDied","Data":"aeec7d539aee1c9aca026dcc0da91002669bb04ab1b72ce2177752885d3e5f88"} Oct 02 11:14:19 crc kubenswrapper[4835]: I1002 11:14:19.407542 4835 generic.go:334] "Generic (PLEG): container finished" podID="492bc087-f2ed-4b40-8fe0-74accde085ce" containerID="5dc34fc11def9148cac8bfdb3f4fd344006ef1894c9394c9b2ee6c62304082ef" exitCode=0 Oct 02 11:14:19 crc kubenswrapper[4835]: I1002 11:14:19.407845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73f9-account-create-kgdkc" event={"ID":"492bc087-f2ed-4b40-8fe0-74accde085ce","Type":"ContainerDied","Data":"5dc34fc11def9148cac8bfdb3f4fd344006ef1894c9394c9b2ee6c62304082ef"} Oct 02 11:14:19 crc kubenswrapper[4835]: I1002 11:14:19.469237 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vg55h" podStartSLOduration=22.469194707 podStartE2EDuration="22.469194707s" podCreationTimestamp="2025-10-02 11:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:19.449097341 +0000 UTC m=+1136.009004922" watchObservedRunningTime="2025-10-02 11:14:19.469194707 +0000 UTC m=+1136.029102288" Oct 02 11:14:20 crc kubenswrapper[4835]: I1002 11:14:20.263550 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" path="/var/lib/kubelet/pods/16ab9964-fa74-4216-a34d-37c0fc813a16/volumes" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.062523 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6f51-account-create-7wtfl" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.097160 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-73f9-account-create-kgdkc" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.206790 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lmdd\" (UniqueName: \"kubernetes.io/projected/53018b21-5abb-49d3-9098-041193565c81-kube-api-access-7lmdd\") pod \"53018b21-5abb-49d3-9098-041193565c81\" (UID: \"53018b21-5abb-49d3-9098-041193565c81\") " Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.207094 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g2kr\" (UniqueName: \"kubernetes.io/projected/492bc087-f2ed-4b40-8fe0-74accde085ce-kube-api-access-7g2kr\") pod \"492bc087-f2ed-4b40-8fe0-74accde085ce\" (UID: \"492bc087-f2ed-4b40-8fe0-74accde085ce\") " Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.220504 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492bc087-f2ed-4b40-8fe0-74accde085ce-kube-api-access-7g2kr" (OuterVolumeSpecName: "kube-api-access-7g2kr") pod "492bc087-f2ed-4b40-8fe0-74accde085ce" (UID: "492bc087-f2ed-4b40-8fe0-74accde085ce"). InnerVolumeSpecName "kube-api-access-7g2kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.221050 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53018b21-5abb-49d3-9098-041193565c81-kube-api-access-7lmdd" (OuterVolumeSpecName: "kube-api-access-7lmdd") pod "53018b21-5abb-49d3-9098-041193565c81" (UID: "53018b21-5abb-49d3-9098-041193565c81"). InnerVolumeSpecName "kube-api-access-7lmdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.309298 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g2kr\" (UniqueName: \"kubernetes.io/projected/492bc087-f2ed-4b40-8fe0-74accde085ce-kube-api-access-7g2kr\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.309632 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lmdd\" (UniqueName: \"kubernetes.io/projected/53018b21-5abb-49d3-9098-041193565c81-kube-api-access-7lmdd\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.439864 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61fa3266-8b7b-45a8-a25f-e400fe1252f1","Type":"ContainerStarted","Data":"b954abdb37a949d39ae07b821bf43e1b89fbe4e1a51701fea7d60933c74cc4c1"} Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.441934 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f51-account-create-7wtfl" event={"ID":"53018b21-5abb-49d3-9098-041193565c81","Type":"ContainerDied","Data":"7a775bac00158852a3a37e48e1a2ba3c3c2aceb62494c844acc08c11e81a44cd"} Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.441986 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a775bac00158852a3a37e48e1a2ba3c3c2aceb62494c844acc08c11e81a44cd" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.442114 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6f51-account-create-7wtfl" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.443558 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73f9-account-create-kgdkc" event={"ID":"492bc087-f2ed-4b40-8fe0-74accde085ce","Type":"ContainerDied","Data":"0f1c4b5646ebc765b096f7a110db6b3d01992500b8df3696ba7e483900498bca"} Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.443604 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f1c4b5646ebc765b096f7a110db6b3d01992500b8df3696ba7e483900498bca" Oct 02 11:14:22 crc kubenswrapper[4835]: I1002 11:14:22.443725 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73f9-account-create-kgdkc" Oct 02 11:14:23 crc kubenswrapper[4835]: I1002 11:14:23.453400 4835 generic.go:334] "Generic (PLEG): container finished" podID="152bf752-1382-4613-84cb-a392f0666d6c" containerID="584b421f8a9321a47c9d94629bf8e2bd75fa8175eda32b2d55c3384b1fdf6f90" exitCode=0 Oct 02 11:14:23 crc kubenswrapper[4835]: I1002 11:14:23.453462 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vg55h" event={"ID":"152bf752-1382-4613-84cb-a392f0666d6c","Type":"ContainerDied","Data":"584b421f8a9321a47c9d94629bf8e2bd75fa8175eda32b2d55c3384b1fdf6f90"} Oct 02 11:14:25 crc kubenswrapper[4835]: I1002 11:14:25.862467 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.004030 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-fernet-keys\") pod \"152bf752-1382-4613-84cb-a392f0666d6c\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.004139 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-scripts\") pod \"152bf752-1382-4613-84cb-a392f0666d6c\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.004167 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knxs5\" (UniqueName: \"kubernetes.io/projected/152bf752-1382-4613-84cb-a392f0666d6c-kube-api-access-knxs5\") pod \"152bf752-1382-4613-84cb-a392f0666d6c\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.004195 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-combined-ca-bundle\") pod \"152bf752-1382-4613-84cb-a392f0666d6c\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.004296 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-config-data\") pod \"152bf752-1382-4613-84cb-a392f0666d6c\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.005059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-credential-keys\") pod \"152bf752-1382-4613-84cb-a392f0666d6c\" (UID: \"152bf752-1382-4613-84cb-a392f0666d6c\") " Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.012423 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "152bf752-1382-4613-84cb-a392f0666d6c" (UID: "152bf752-1382-4613-84cb-a392f0666d6c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.012530 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152bf752-1382-4613-84cb-a392f0666d6c-kube-api-access-knxs5" (OuterVolumeSpecName: "kube-api-access-knxs5") pod "152bf752-1382-4613-84cb-a392f0666d6c" (UID: "152bf752-1382-4613-84cb-a392f0666d6c"). InnerVolumeSpecName "kube-api-access-knxs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.012546 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "152bf752-1382-4613-84cb-a392f0666d6c" (UID: "152bf752-1382-4613-84cb-a392f0666d6c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.016355 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-scripts" (OuterVolumeSpecName: "scripts") pod "152bf752-1382-4613-84cb-a392f0666d6c" (UID: "152bf752-1382-4613-84cb-a392f0666d6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.033027 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "152bf752-1382-4613-84cb-a392f0666d6c" (UID: "152bf752-1382-4613-84cb-a392f0666d6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.036889 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-config-data" (OuterVolumeSpecName: "config-data") pod "152bf752-1382-4613-84cb-a392f0666d6c" (UID: "152bf752-1382-4613-84cb-a392f0666d6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.107468 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.107542 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.107556 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knxs5\" (UniqueName: \"kubernetes.io/projected/152bf752-1382-4613-84cb-a392f0666d6c-kube-api-access-knxs5\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.107566 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.107577 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.107587 4835 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/152bf752-1382-4613-84cb-a392f0666d6c-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.515370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vg55h" event={"ID":"152bf752-1382-4613-84cb-a392f0666d6c","Type":"ContainerDied","Data":"e6f1cd73e6a5e6920c16f9f71dd411d5009f5bc358b4335d262790b910d0dea2"} Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.515680 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6f1cd73e6a5e6920c16f9f71dd411d5009f5bc358b4335d262790b910d0dea2" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.515806 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vg55h" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.519961 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rk79x"] Oct 02 11:14:26 crc kubenswrapper[4835]: E1002 11:14:26.520427 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520444 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" Oct 02 11:14:26 crc kubenswrapper[4835]: E1002 11:14:26.520466 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53018b21-5abb-49d3-9098-041193565c81" containerName="mariadb-account-create" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520476 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="53018b21-5abb-49d3-9098-041193565c81" containerName="mariadb-account-create" Oct 02 11:14:26 crc kubenswrapper[4835]: E1002 11:14:26.520500 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="init" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520508 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="init" Oct 02 11:14:26 crc kubenswrapper[4835]: E1002 11:14:26.520530 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152bf752-1382-4613-84cb-a392f0666d6c" containerName="keystone-bootstrap" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520538 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="152bf752-1382-4613-84cb-a392f0666d6c" containerName="keystone-bootstrap" Oct 02 11:14:26 crc kubenswrapper[4835]: E1002 11:14:26.520557 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492bc087-f2ed-4b40-8fe0-74accde085ce" containerName="mariadb-account-create" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520564 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="492bc087-f2ed-4b40-8fe0-74accde085ce" containerName="mariadb-account-create" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520778 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="492bc087-f2ed-4b40-8fe0-74accde085ce" containerName="mariadb-account-create" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520841 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="53018b21-5abb-49d3-9098-041193565c81" containerName="mariadb-account-create" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520863 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="152bf752-1382-4613-84cb-a392f0666d6c" containerName="keystone-bootstrap" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.520887 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ab9964-fa74-4216-a34d-37c0fc813a16" containerName="dnsmasq-dns" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.521797 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.527143 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4fkhm" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.527424 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.547251 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rk79x"] Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.623624 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-db-sync-config-data\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.623808 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-combined-ca-bundle\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.623853 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t69tz\" (UniqueName: \"kubernetes.io/projected/e627031e-a0d4-459c-9250-bfdcf645d133-kube-api-access-t69tz\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.676258 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rk9m4"] Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.679306 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.683729 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.683786 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.683916 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2nd9j" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.692594 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rk9m4"] Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.726577 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-combined-ca-bundle\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.726671 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t69tz\" (UniqueName: \"kubernetes.io/projected/e627031e-a0d4-459c-9250-bfdcf645d133-kube-api-access-t69tz\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.726727 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-db-sync-config-data\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.742053 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-combined-ca-bundle\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.744113 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-db-sync-config-data\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.764160 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t69tz\" (UniqueName: \"kubernetes.io/projected/e627031e-a0d4-459c-9250-bfdcf645d133-kube-api-access-t69tz\") pod \"barbican-db-sync-rk79x\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.828727 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-combined-ca-bundle\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.828839 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-qnrrb\" (UniqueName: \"kubernetes.io/projected/af74a3af-19d5-45bf-b366-5e79fe901079-kube-api-access-qnrrb\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.828911 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-config\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.845959 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rk79x" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.930864 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnrrb\" (UniqueName: \"kubernetes.io/projected/af74a3af-19d5-45bf-b366-5e79fe901079-kube-api-access-qnrrb\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.931016 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-config\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.931092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-combined-ca-bundle\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.939439 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-config\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.942457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-combined-ca-bundle\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.952760 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnrrb\" (UniqueName: \"kubernetes.io/projected/af74a3af-19d5-45bf-b366-5e79fe901079-kube-api-access-qnrrb\") pod \"neutron-db-sync-rk9m4\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.991361 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85675864fd-9krzl"] Oct 02 11:14:26 crc kubenswrapper[4835]: I1002 11:14:26.994704 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.002062 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.002153 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5nt8" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.002161 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.002347 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.002534 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.004798 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.006870 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85675864fd-9krzl"] Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.041907 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.134809 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-fernet-keys\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.134863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv6kv\" (UniqueName: \"kubernetes.io/projected/8753b414-73d0-489d-ab44-2a54891ba36b-kube-api-access-lv6kv\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.135054 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-public-tls-certs\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.135110 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-scripts\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.135207 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-config-data\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.135289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-credential-keys\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.135418 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-combined-ca-bundle\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.135535 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-internal-tls-certs\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.236942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-credential-keys\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.237021 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-combined-ca-bundle\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.237063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-internal-tls-certs\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.237106 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-fernet-keys\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.237124 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv6kv\" (UniqueName: \"kubernetes.io/projected/8753b414-73d0-489d-ab44-2a54891ba36b-kube-api-access-lv6kv\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.237166 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-public-tls-certs\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.237185 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-scripts\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.237213 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-config-data\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.241912 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-fernet-keys\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.242031 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-public-tls-certs\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.243213 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-combined-ca-bundle\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.244537 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-config-data\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.244848 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-internal-tls-certs\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.245058 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-credential-keys\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.259259 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv6kv\" (UniqueName: \"kubernetes.io/projected/8753b414-73d0-489d-ab44-2a54891ba36b-kube-api-access-lv6kv\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.264394 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8753b414-73d0-489d-ab44-2a54891ba36b-scripts\") pod \"keystone-85675864fd-9krzl\" (UID: \"8753b414-73d0-489d-ab44-2a54891ba36b\") " 
pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:27 crc kubenswrapper[4835]: I1002 11:14:27.327043 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:35 crc kubenswrapper[4835]: E1002 11:14:35.187295 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 02 11:14:35 crc kubenswrapper[4835]: E1002 11:14:35.188281 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fcms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(61fa3266-8b7b-45a8-a25f-e400fe1252f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:14:35 crc kubenswrapper[4835]: I1002 11:14:35.595355 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rk9m4"] Oct 02 11:14:35 crc kubenswrapper[4835]: I1002 11:14:35.606648 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85675864fd-9krzl"] Oct 02 11:14:35 crc kubenswrapper[4835]: W1002 11:14:35.613664 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf74a3af_19d5_45bf_b366_5e79fe901079.slice/crio-593980dde5525de1afd38338dfba9799b7cf1182bb75e45209ffd4d669e20f83 WatchSource:0}: Error finding container 593980dde5525de1afd38338dfba9799b7cf1182bb75e45209ffd4d669e20f83: Status 404 returned error can't find the container with id 593980dde5525de1afd38338dfba9799b7cf1182bb75e45209ffd4d669e20f83 Oct 02 11:14:35 crc kubenswrapper[4835]: I1002 11:14:35.716912 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rk79x"] Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.613843 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rk79x" event={"ID":"e627031e-a0d4-459c-9250-bfdcf645d133","Type":"ContainerStarted","Data":"c5a409133c595f798ef634ee581676581bf3fd63fd38629a70fdb53db55d4f82"} Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.616143 4835 generic.go:334] "Generic (PLEG): container finished" podID="01665ace-3f34-4029-a202-6f350e8497f9" 
containerID="cb8cf6922fd514e24e9fa8c0746f61209d84a55c0f311ae3fe3fd776f075bb5d" exitCode=0 Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.616207 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jwqkm" event={"ID":"01665ace-3f34-4029-a202-6f350e8497f9","Type":"ContainerDied","Data":"cb8cf6922fd514e24e9fa8c0746f61209d84a55c0f311ae3fe3fd776f075bb5d"} Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.618859 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2vdk6" event={"ID":"fc534d7c-ef08-44e5-b56d-d3421477c51d","Type":"ContainerStarted","Data":"cd596dd7792ef11c94784376d1552c2863357281364a72de1abc0804776a24a8"} Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.622187 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85675864fd-9krzl" event={"ID":"8753b414-73d0-489d-ab44-2a54891ba36b","Type":"ContainerStarted","Data":"23d67f638937a098aa8d735a505637c999ef2e4f05bc59e0a878969cb23a8a72"} Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.622262 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85675864fd-9krzl" event={"ID":"8753b414-73d0-489d-ab44-2a54891ba36b","Type":"ContainerStarted","Data":"fa5c5e13fa3c07348406de5e031de6e19026d8ecc3f2e347f56ba41eccd1a3e9"} Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.622372 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.626793 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rk9m4" event={"ID":"af74a3af-19d5-45bf-b366-5e79fe901079","Type":"ContainerStarted","Data":"c1d1c1bc0be760576a311187ca4f80eeec19da3191c231e3a0a2983e09e1f0ea"} Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.626845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rk9m4" event={"ID":"af74a3af-19d5-45bf-b366-5e79fe901079","Type":"ContainerStarted","Data":"593980dde5525de1afd38338dfba9799b7cf1182bb75e45209ffd4d669e20f83"} Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.660185 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2vdk6" podStartSLOduration=2.871270528 podStartE2EDuration="50.660161204s" podCreationTimestamp="2025-10-02 11:13:46 +0000 UTC" firstStartedPulling="2025-10-02 11:13:47.548811834 +0000 UTC m=+1104.108719415" lastFinishedPulling="2025-10-02 11:14:35.33770251 +0000 UTC m=+1151.897610091" observedRunningTime="2025-10-02 11:14:36.658176217 +0000 UTC m=+1153.218083798" watchObservedRunningTime="2025-10-02 11:14:36.660161204 +0000 UTC m=+1153.220068785" Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.686585 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85675864fd-9krzl" podStartSLOduration=10.686561861 podStartE2EDuration="10.686561861s" podCreationTimestamp="2025-10-02 11:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:36.682728311 +0000 UTC m=+1153.242635892" watchObservedRunningTime="2025-10-02 11:14:36.686561861 +0000 UTC m=+1153.246469442" Oct 02 11:14:36 crc kubenswrapper[4835]: I1002 11:14:36.711052 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rk9m4" podStartSLOduration=10.711030783 podStartE2EDuration="10.711030783s" 
podCreationTimestamp="2025-10-02 11:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:36.702813727 +0000 UTC m=+1153.262721308" watchObservedRunningTime="2025-10-02 11:14:36.711030783 +0000 UTC m=+1153.270938364" Oct 02 11:14:39 crc kubenswrapper[4835]: I1002 11:14:39.986314 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jwqkm" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.124866 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czplq\" (UniqueName: \"kubernetes.io/projected/01665ace-3f34-4029-a202-6f350e8497f9-kube-api-access-czplq\") pod \"01665ace-3f34-4029-a202-6f350e8497f9\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.124991 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-scripts\") pod \"01665ace-3f34-4029-a202-6f350e8497f9\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.125020 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01665ace-3f34-4029-a202-6f350e8497f9-logs\") pod \"01665ace-3f34-4029-a202-6f350e8497f9\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.125037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-combined-ca-bundle\") pod \"01665ace-3f34-4029-a202-6f350e8497f9\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.125061 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-config-data\") pod \"01665ace-3f34-4029-a202-6f350e8497f9\" (UID: \"01665ace-3f34-4029-a202-6f350e8497f9\") " Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.125621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01665ace-3f34-4029-a202-6f350e8497f9-logs" (OuterVolumeSpecName: "logs") pod "01665ace-3f34-4029-a202-6f350e8497f9" (UID: "01665ace-3f34-4029-a202-6f350e8497f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.131420 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01665ace-3f34-4029-a202-6f350e8497f9-kube-api-access-czplq" (OuterVolumeSpecName: "kube-api-access-czplq") pod "01665ace-3f34-4029-a202-6f350e8497f9" (UID: "01665ace-3f34-4029-a202-6f350e8497f9"). InnerVolumeSpecName "kube-api-access-czplq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.132955 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-scripts" (OuterVolumeSpecName: "scripts") pod "01665ace-3f34-4029-a202-6f350e8497f9" (UID: "01665ace-3f34-4029-a202-6f350e8497f9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.149984 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-config-data" (OuterVolumeSpecName: "config-data") pod "01665ace-3f34-4029-a202-6f350e8497f9" (UID: "01665ace-3f34-4029-a202-6f350e8497f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.152391 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01665ace-3f34-4029-a202-6f350e8497f9" (UID: "01665ace-3f34-4029-a202-6f350e8497f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.226916 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.226967 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01665ace-3f34-4029-a202-6f350e8497f9-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.226980 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.226995 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01665ace-3f34-4029-a202-6f350e8497f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.227008 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czplq\" (UniqueName: \"kubernetes.io/projected/01665ace-3f34-4029-a202-6f350e8497f9-kube-api-access-czplq\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.666555 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jwqkm" event={"ID":"01665ace-3f34-4029-a202-6f350e8497f9","Type":"ContainerDied","Data":"61f7c0f8e2d4024320d6ec93e9e1fecd7b158472a8e9477be491493b32f82854"} Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.666620 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f7c0f8e2d4024320d6ec93e9e1fecd7b158472a8e9477be491493b32f82854" Oct 02 11:14:40 crc kubenswrapper[4835]: I1002 11:14:40.666652 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jwqkm" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.104347 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7cfb46b6c6-x7hzh"] Oct 02 11:14:41 crc kubenswrapper[4835]: E1002 11:14:41.105037 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01665ace-3f34-4029-a202-6f350e8497f9" containerName="placement-db-sync" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.105068 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="01665ace-3f34-4029-a202-6f350e8497f9" containerName="placement-db-sync" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.105335 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="01665ace-3f34-4029-a202-6f350e8497f9" containerName="placement-db-sync" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.106521 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.108922 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.109320 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8vv6j" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.109499 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.109598 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.109780 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.134495 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cfb46b6c6-x7hzh"] Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.244317 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-logs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.244389 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-public-tls-certs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.244426 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-combined-ca-bundle\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.244446 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-internal-tls-certs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: 
\"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.244520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-config-data\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.244564 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-scripts\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.244601 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdjnm\" (UniqueName: \"kubernetes.io/projected/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-kube-api-access-xdjnm\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.346469 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-scripts\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.346596 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdjnm\" (UniqueName: \"kubernetes.io/projected/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-kube-api-access-xdjnm\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.346635 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-logs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.346685 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-public-tls-certs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.346755 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-combined-ca-bundle\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.346793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-internal-tls-certs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " 
pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.346988 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-config-data\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.348533 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-logs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.352186 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-internal-tls-certs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.352867 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-config-data\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.353400 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-combined-ca-bundle\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.354500 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-scripts\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.356113 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-public-tls-certs\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.366195 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdjnm\" (UniqueName: \"kubernetes.io/projected/f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def-kube-api-access-xdjnm\") pod \"placement-7cfb46b6c6-x7hzh\" (UID: \"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def\") " pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:41 crc kubenswrapper[4835]: I1002 11:14:41.428271 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:49 crc kubenswrapper[4835]: E1002 11:14:49.605939 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 02 11:14:49 crc kubenswrapper[4835]: E1002 11:14:49.606594 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fcms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(61fa3266-8b7b-45a8-a25f-e400fe1252f1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:14:49 crc kubenswrapper[4835]: E1002 11:14:49.608305 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" Oct 02 11:14:49 crc kubenswrapper[4835]: I1002 11:14:49.749283 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-central-agent" containerID="cri-o://e30a4c7e371b0eeceea0598c139b528515c98bff1cbbf92a725014b76bc30a46" gracePeriod=30 Oct 02 11:14:49 crc kubenswrapper[4835]: I1002 11:14:49.749930 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-notification-agent" containerID="cri-o://b954abdb37a949d39ae07b821bf43e1b89fbe4e1a51701fea7d60933c74cc4c1" gracePeriod=30 Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.049172 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cfb46b6c6-x7hzh"] Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.777159 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rk79x" event={"ID":"e627031e-a0d4-459c-9250-bfdcf645d133","Type":"ContainerStarted","Data":"9ee76de918bc4e19b8144d3bd91d16557d815e1f1a8213cead7b8641ab72641f"} Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.785762 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cfb46b6c6-x7hzh" event={"ID":"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def","Type":"ContainerStarted","Data":"d05e556cb0cec46ce32c18833999e8c8a5c697d0f2f0362ab178a0fef023faa0"} Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.785813 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cfb46b6c6-x7hzh" event={"ID":"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def","Type":"ContainerStarted","Data":"b8d8a4a8ac18ec882f8e716acc826419a3bccf2b889dc6a4f3305f5c46759b08"} Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.785824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cfb46b6c6-x7hzh" event={"ID":"f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def","Type":"ContainerStarted","Data":"70c99ef05c2cfc26eaec031856a63e2739b3f1dfdbcb0dbf0f7ea1c03e701e5a"} Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.786277 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.786393 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.788932 4835 generic.go:334] "Generic (PLEG): container finished" podID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerID="e30a4c7e371b0eeceea0598c139b528515c98bff1cbbf92a725014b76bc30a46" exitCode=0 Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.789002 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61fa3266-8b7b-45a8-a25f-e400fe1252f1","Type":"ContainerDied","Data":"e30a4c7e371b0eeceea0598c139b528515c98bff1cbbf92a725014b76bc30a46"} Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.807086 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rk79x" podStartSLOduration=10.940121572 podStartE2EDuration="24.807061334s" podCreationTimestamp="2025-10-02 11:14:26 +0000 UTC" firstStartedPulling="2025-10-02 
11:14:35.734435087 +0000 UTC m=+1152.294342668" lastFinishedPulling="2025-10-02 11:14:49.601374849 +0000 UTC m=+1166.161282430" observedRunningTime="2025-10-02 11:14:50.797066178 +0000 UTC m=+1167.356973759" watchObservedRunningTime="2025-10-02 11:14:50.807061334 +0000 UTC m=+1167.366968915" Oct 02 11:14:50 crc kubenswrapper[4835]: I1002 11:14:50.831124 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7cfb46b6c6-x7hzh" podStartSLOduration=9.831094173 podStartE2EDuration="9.831094173s" podCreationTimestamp="2025-10-02 11:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:50.823125265 +0000 UTC m=+1167.383032846" watchObservedRunningTime="2025-10-02 11:14:50.831094173 +0000 UTC m=+1167.391001754" Oct 02 11:14:53 crc kubenswrapper[4835]: I1002 11:14:53.827643 4835 generic.go:334] "Generic (PLEG): container finished" podID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerID="b954abdb37a949d39ae07b821bf43e1b89fbe4e1a51701fea7d60933c74cc4c1" exitCode=0 Oct 02 11:14:53 crc kubenswrapper[4835]: I1002 11:14:53.827723 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61fa3266-8b7b-45a8-a25f-e400fe1252f1","Type":"ContainerDied","Data":"b954abdb37a949d39ae07b821bf43e1b89fbe4e1a51701fea7d60933c74cc4c1"} Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.278494 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.413492 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-config-data\") pod \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.414675 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-scripts\") pod \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.414723 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-log-httpd\") pod \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.414809 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-combined-ca-bundle\") pod \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.415163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fcms\" (UniqueName: \"kubernetes.io/projected/61fa3266-8b7b-45a8-a25f-e400fe1252f1-kube-api-access-2fcms\") pod \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.415312 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-sg-core-conf-yaml\") pod \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.415367 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-run-httpd\") pod \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\" (UID: \"61fa3266-8b7b-45a8-a25f-e400fe1252f1\") " Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.415549 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61fa3266-8b7b-45a8-a25f-e400fe1252f1" (UID: "61fa3266-8b7b-45a8-a25f-e400fe1252f1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.415902 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61fa3266-8b7b-45a8-a25f-e400fe1252f1" (UID: "61fa3266-8b7b-45a8-a25f-e400fe1252f1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.416560 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.416587 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61fa3266-8b7b-45a8-a25f-e400fe1252f1-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.428465 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61fa3266-8b7b-45a8-a25f-e400fe1252f1" (UID: "61fa3266-8b7b-45a8-a25f-e400fe1252f1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.435737 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61fa3266-8b7b-45a8-a25f-e400fe1252f1-kube-api-access-2fcms" (OuterVolumeSpecName: "kube-api-access-2fcms") pod "61fa3266-8b7b-45a8-a25f-e400fe1252f1" (UID: "61fa3266-8b7b-45a8-a25f-e400fe1252f1"). InnerVolumeSpecName "kube-api-access-2fcms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.447637 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-scripts" (OuterVolumeSpecName: "scripts") pod "61fa3266-8b7b-45a8-a25f-e400fe1252f1" (UID: "61fa3266-8b7b-45a8-a25f-e400fe1252f1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.488801 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61fa3266-8b7b-45a8-a25f-e400fe1252f1" (UID: "61fa3266-8b7b-45a8-a25f-e400fe1252f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.490334 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-config-data" (OuterVolumeSpecName: "config-data") pod "61fa3266-8b7b-45a8-a25f-e400fe1252f1" (UID: "61fa3266-8b7b-45a8-a25f-e400fe1252f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.518197 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.518251 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fcms\" (UniqueName: \"kubernetes.io/projected/61fa3266-8b7b-45a8-a25f-e400fe1252f1-kube-api-access-2fcms\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.518268 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.518282 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.518292 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61fa3266-8b7b-45a8-a25f-e400fe1252f1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.842926 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61fa3266-8b7b-45a8-a25f-e400fe1252f1","Type":"ContainerDied","Data":"b3bb92927e413932e542ec7378c79783106619e77c7f99d73a30c9c02e61b516"} Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.843013 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.843018 4835 scope.go:117] "RemoveContainer" containerID="b954abdb37a949d39ae07b821bf43e1b89fbe4e1a51701fea7d60933c74cc4c1" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.874983 4835 scope.go:117] "RemoveContainer" containerID="e30a4c7e371b0eeceea0598c139b528515c98bff1cbbf92a725014b76bc30a46" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.917282 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.947662 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.955330 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:54 crc kubenswrapper[4835]: E1002 11:14:54.955898 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-central-agent" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.955929 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-central-agent" Oct 02 11:14:54 crc kubenswrapper[4835]: E1002 11:14:54.955949 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-notification-agent" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.955958 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-notification-agent" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.956872 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-notification-agent" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.956922 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" containerName="ceilometer-central-agent" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.960693 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.963926 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.964356 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:14:54 crc kubenswrapper[4835]: I1002 11:14:54.987084 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.130202 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-scripts\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.130301 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-log-httpd\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.130335 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-run-httpd\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.130426 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.130475 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j2hx\" (UniqueName: \"kubernetes.io/projected/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-kube-api-access-9j2hx\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.130509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-config-data\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.130530 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.232196 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 
11:14:55.232375 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j2hx\" (UniqueName: \"kubernetes.io/projected/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-kube-api-access-9j2hx\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.232423 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-config-data\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.232455 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.232530 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-scripts\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.232601 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-log-httpd\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.232642 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-run-httpd\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.233200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-run-httpd\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.233834 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-log-httpd\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.238059 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-config-data\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.238263 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-scripts\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.239029 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.240175 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.255013 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j2hx\" (UniqueName: \"kubernetes.io/projected/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-kube-api-access-9j2hx\") pod \"ceilometer-0\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.302422 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.559287 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:55 crc kubenswrapper[4835]: I1002 11:14:55.856567 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerStarted","Data":"2497d86137fbad3d073a9f7a2e30bcc7181c495342e6bfc8f7e706be609cdf22"} Oct 02 11:14:56 crc kubenswrapper[4835]: I1002 11:14:56.272266 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61fa3266-8b7b-45a8-a25f-e400fe1252f1" path="/var/lib/kubelet/pods/61fa3266-8b7b-45a8-a25f-e400fe1252f1/volumes" Oct 02 11:14:57 crc kubenswrapper[4835]: I1002 11:14:57.883710 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerStarted","Data":"ebb345f1b91ce7c17307535e83371a5ffa342c531128485a73f1f6ab04b1c01c"} Oct 02 11:14:58 crc kubenswrapper[4835]: I1002 11:14:58.982019 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85675864fd-9krzl" Oct 02 11:14:59 crc kubenswrapper[4835]: I1002 11:14:59.907176 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerStarted","Data":"d1a88fa97c5d0a6cb7a9b0313a67615f960cdba5f2c29016bae59a20fc2408e4"} Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.144917 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.154561 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.154678 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.161172 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.161396 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pszs8" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.161546 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.233533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9d88\" (UniqueName: \"kubernetes.io/projected/1ea29c73-5a74-4943-94c9-78c3d3c01c06-kube-api-access-g9d88\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.233584 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.233616 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config-secret\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.233683 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.239457 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24"] Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.240721 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.243708 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.243995 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.334980 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9d88\" (UniqueName: \"kubernetes.io/projected/1ea29c73-5a74-4943-94c9-78c3d3c01c06-kube-api-access-g9d88\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.335303 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.335405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config-secret\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.335553 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.438408 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbzl\" (UniqueName: \"kubernetes.io/projected/2905629f-e865-4e05-a222-a84e1fa0b88a-kube-api-access-cqbzl\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.438766 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2905629f-e865-4e05-a222-a84e1fa0b88a-secret-volume\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.438872 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2905629f-e865-4e05-a222-a84e1fa0b88a-config-volume\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.540699 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2905629f-e865-4e05-a222-a84e1fa0b88a-secret-volume\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.540789 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2905629f-e865-4e05-a222-a84e1fa0b88a-config-volume\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.541012 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbzl\" (UniqueName: \"kubernetes.io/projected/2905629f-e865-4e05-a222-a84e1fa0b88a-kube-api-access-cqbzl\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.546117 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24"] Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.546196 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.546214 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.546251 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.547081 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2905629f-e865-4e05-a222-a84e1fa0b88a-config-volume\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.547084 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.547405 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:00 crc kubenswrapper[4835]: E1002 11:15:00.547405 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-g9d88 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="1ea29c73-5a74-4943-94c9-78c3d3c01c06" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.547505 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.554189 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.554705 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config-secret\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.570769 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2905629f-e865-4e05-a222-a84e1fa0b88a-secret-volume\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.572465 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9d88\" (UniqueName: \"kubernetes.io/projected/1ea29c73-5a74-4943-94c9-78c3d3c01c06-kube-api-access-g9d88\") pod \"openstackclient\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.573592 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbzl\" (UniqueName: \"kubernetes.io/projected/2905629f-e865-4e05-a222-a84e1fa0b88a-kube-api-access-cqbzl\") pod \"collect-profiles-29323395-cdw24\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.642837 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/748cf871-b4a4-418b-9c72-c2c21e1f85ad-openstack-config\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.645134 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/748cf871-b4a4-418b-9c72-c2c21e1f85ad-openstack-config-secret\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.645325 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/748cf871-b4a4-418b-9c72-c2c21e1f85ad-combined-ca-bundle\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.645451 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks9gc\" (UniqueName: \"kubernetes.io/projected/748cf871-b4a4-418b-9c72-c2c21e1f85ad-kube-api-access-ks9gc\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " 
pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.747283 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/748cf871-b4a4-418b-9c72-c2c21e1f85ad-openstack-config\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.747335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/748cf871-b4a4-418b-9c72-c2c21e1f85ad-openstack-config-secret\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.747403 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/748cf871-b4a4-418b-9c72-c2c21e1f85ad-combined-ca-bundle\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.747450 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks9gc\" (UniqueName: \"kubernetes.io/projected/748cf871-b4a4-418b-9c72-c2c21e1f85ad-kube-api-access-ks9gc\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.846989 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.862912 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/748cf871-b4a4-418b-9c72-c2c21e1f85ad-openstack-config\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.869122 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/748cf871-b4a4-418b-9c72-c2c21e1f85ad-openstack-config-secret\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.869143 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks9gc\" (UniqueName: \"kubernetes.io/projected/748cf871-b4a4-418b-9c72-c2c21e1f85ad-kube-api-access-ks9gc\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.869823 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/748cf871-b4a4-418b-9c72-c2c21e1f85ad-combined-ca-bundle\") pod \"openstackclient\" (UID: \"748cf871-b4a4-418b-9c72-c2c21e1f85ad\") " pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.915247 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.921323 4835 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1ea29c73-5a74-4943-94c9-78c3d3c01c06" podUID="748cf871-b4a4-418b-9c72-c2c21e1f85ad" Oct 02 11:15:00 crc kubenswrapper[4835]: I1002 11:15:00.953816 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.076587 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config-secret\") pod \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.078679 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9d88\" (UniqueName: \"kubernetes.io/projected/1ea29c73-5a74-4943-94c9-78c3d3c01c06-kube-api-access-g9d88\") pod \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.079977 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config\") pod \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.080192 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-combined-ca-bundle\") pod \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\" (UID: \"1ea29c73-5a74-4943-94c9-78c3d3c01c06\") " Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.088106 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1ea29c73-5a74-4943-94c9-78c3d3c01c06" (UID: "1ea29c73-5a74-4943-94c9-78c3d3c01c06"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.089699 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ea29c73-5a74-4943-94c9-78c3d3c01c06" (UID: "1ea29c73-5a74-4943-94c9-78c3d3c01c06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.112311 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea29c73-5a74-4943-94c9-78c3d3c01c06-kube-api-access-g9d88" (OuterVolumeSpecName: "kube-api-access-g9d88") pod "1ea29c73-5a74-4943-94c9-78c3d3c01c06" (UID: "1ea29c73-5a74-4943-94c9-78c3d3c01c06"). InnerVolumeSpecName "kube-api-access-g9d88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.161434 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.182323 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.182382 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9d88\" (UniqueName: \"kubernetes.io/projected/1ea29c73-5a74-4943-94c9-78c3d3c01c06-kube-api-access-g9d88\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.182427 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea29c73-5a74-4943-94c9-78c3d3c01c06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.362880 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24"] Oct 02 11:15:01 crc kubenswrapper[4835]: W1002 11:15:01.372590 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2905629f_e865_4e05_a222_a84e1fa0b88a.slice/crio-b61d1fb60f0b37892b48026d0fa8b3edec05647fa326c06765ea838dd72dc954 WatchSource:0}: Error finding container b61d1fb60f0b37892b48026d0fa8b3edec05647fa326c06765ea838dd72dc954: Status 404 returned error can't find the container with id b61d1fb60f0b37892b48026d0fa8b3edec05647fa326c06765ea838dd72dc954 Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.603510 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1ea29c73-5a74-4943-94c9-78c3d3c01c06" (UID: "1ea29c73-5a74-4943-94c9-78c3d3c01c06"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.637234 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:01 crc kubenswrapper[4835]: W1002 11:15:01.651513 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod748cf871_b4a4_418b_9c72_c2c21e1f85ad.slice/crio-d4044d0db6a099664c04c8459f6dfbbb2648200d1d86cd885fd2f20d7c35fb5d WatchSource:0}: Error finding container d4044d0db6a099664c04c8459f6dfbbb2648200d1d86cd885fd2f20d7c35fb5d: Status 404 returned error can't find the container with id d4044d0db6a099664c04c8459f6dfbbb2648200d1d86cd885fd2f20d7c35fb5d Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.693194 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ea29c73-5a74-4943-94c9-78c3d3c01c06-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.924309 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" event={"ID":"2905629f-e865-4e05-a222-a84e1fa0b88a","Type":"ContainerStarted","Data":"b61d1fb60f0b37892b48026d0fa8b3edec05647fa326c06765ea838dd72dc954"} Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.925937 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"748cf871-b4a4-418b-9c72-c2c21e1f85ad","Type":"ContainerStarted","Data":"d4044d0db6a099664c04c8459f6dfbbb2648200d1d86cd885fd2f20d7c35fb5d"} Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.926202 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:01 crc kubenswrapper[4835]: I1002 11:15:01.933145 4835 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1ea29c73-5a74-4943-94c9-78c3d3c01c06" podUID="748cf871-b4a4-418b-9c72-c2c21e1f85ad" Oct 02 11:15:02 crc kubenswrapper[4835]: I1002 11:15:02.264176 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ea29c73-5a74-4943-94c9-78c3d3c01c06" path="/var/lib/kubelet/pods/1ea29c73-5a74-4943-94c9-78c3d3c01c06/volumes" Oct 02 11:15:03 crc kubenswrapper[4835]: I1002 11:15:03.942366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" event={"ID":"2905629f-e865-4e05-a222-a84e1fa0b88a","Type":"ContainerStarted","Data":"e9904b51882f3055edd07d5c0bde074ccce42910ec6e682b9f1ed535ee701864"} Oct 02 11:15:04 crc kubenswrapper[4835]: I1002 11:15:04.962348 4835 generic.go:334] "Generic (PLEG): container finished" podID="2905629f-e865-4e05-a222-a84e1fa0b88a" containerID="e9904b51882f3055edd07d5c0bde074ccce42910ec6e682b9f1ed535ee701864" exitCode=0 Oct 02 11:15:04 crc kubenswrapper[4835]: I1002 11:15:04.963009 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" event={"ID":"2905629f-e865-4e05-a222-a84e1fa0b88a","Type":"ContainerDied","Data":"e9904b51882f3055edd07d5c0bde074ccce42910ec6e682b9f1ed535ee701864"} Oct 02 11:15:04 crc kubenswrapper[4835]: I1002 11:15:04.969219 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerStarted","Data":"a7a69a8916056cf4990fd1f397df905fafa698d5942683a1009862bcc93a5e45"} Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.390326 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.524607 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqbzl\" (UniqueName: \"kubernetes.io/projected/2905629f-e865-4e05-a222-a84e1fa0b88a-kube-api-access-cqbzl\") pod \"2905629f-e865-4e05-a222-a84e1fa0b88a\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.524699 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2905629f-e865-4e05-a222-a84e1fa0b88a-config-volume\") pod \"2905629f-e865-4e05-a222-a84e1fa0b88a\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.525052 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2905629f-e865-4e05-a222-a84e1fa0b88a-secret-volume\") pod \"2905629f-e865-4e05-a222-a84e1fa0b88a\" (UID: \"2905629f-e865-4e05-a222-a84e1fa0b88a\") " Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.525477 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2905629f-e865-4e05-a222-a84e1fa0b88a-config-volume" (OuterVolumeSpecName: "config-volume") pod "2905629f-e865-4e05-a222-a84e1fa0b88a" (UID: "2905629f-e865-4e05-a222-a84e1fa0b88a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.526053 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2905629f-e865-4e05-a222-a84e1fa0b88a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.543516 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2905629f-e865-4e05-a222-a84e1fa0b88a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2905629f-e865-4e05-a222-a84e1fa0b88a" (UID: "2905629f-e865-4e05-a222-a84e1fa0b88a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.545509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2905629f-e865-4e05-a222-a84e1fa0b88a-kube-api-access-cqbzl" (OuterVolumeSpecName: "kube-api-access-cqbzl") pod "2905629f-e865-4e05-a222-a84e1fa0b88a" (UID: "2905629f-e865-4e05-a222-a84e1fa0b88a"). InnerVolumeSpecName "kube-api-access-cqbzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.627503 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2905629f-e865-4e05-a222-a84e1fa0b88a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.627553 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqbzl\" (UniqueName: \"kubernetes.io/projected/2905629f-e865-4e05-a222-a84e1fa0b88a-kube-api-access-cqbzl\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.989583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" event={"ID":"2905629f-e865-4e05-a222-a84e1fa0b88a","Type":"ContainerDied","Data":"b61d1fb60f0b37892b48026d0fa8b3edec05647fa326c06765ea838dd72dc954"} Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.989646 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61d1fb60f0b37892b48026d0fa8b3edec05647fa326c06765ea838dd72dc954" Oct 02 11:15:06 crc kubenswrapper[4835]: I1002 11:15:06.989614 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24" Oct 02 11:15:08 crc kubenswrapper[4835]: I1002 11:15:08.004957 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerStarted","Data":"796f67033b0c50666606049060c308e2500420b4c29cc42a982cabf8c25466ac"} Oct 02 11:15:08 crc kubenswrapper[4835]: I1002 11:15:08.005133 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:15:08 crc kubenswrapper[4835]: I1002 11:15:08.046974 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.233335514 podStartE2EDuration="14.046951094s" podCreationTimestamp="2025-10-02 11:14:54 +0000 UTC" firstStartedPulling="2025-10-02 11:14:55.567645995 +0000 UTC m=+1172.127553576" lastFinishedPulling="2025-10-02 11:15:07.381261575 +0000 UTC m=+1183.941169156" observedRunningTime="2025-10-02 11:15:08.043808174 +0000 UTC m=+1184.603715765" watchObservedRunningTime="2025-10-02 11:15:08.046951094 +0000 UTC m=+1184.606858675" Oct 02 11:15:16 crc kubenswrapper[4835]: I1002 11:15:16.273801 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:15:16 crc kubenswrapper[4835]: I1002 11:15:16.381775 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cfb46b6c6-x7hzh" Oct 02 11:15:18 crc kubenswrapper[4835]: I1002 11:15:18.094765 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"748cf871-b4a4-418b-9c72-c2c21e1f85ad","Type":"ContainerStarted","Data":"39c38bd218cafd96cb7fbc11b159809dcde6218ec555b34e5d8c3c04e0f0fc50"} Oct 02 11:15:18 crc kubenswrapper[4835]: I1002 11:15:18.123003 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.270019448 podStartE2EDuration="18.122982185s" podCreationTimestamp="2025-10-02 11:15:00 +0000 UTC" firstStartedPulling="2025-10-02 11:15:01.654405635 +0000 UTC m=+1178.214313216" lastFinishedPulling="2025-10-02 11:15:17.507368372 +0000 UTC m=+1194.067275953" observedRunningTime="2025-10-02 11:15:18.115706107 +0000 UTC m=+1194.675613688" watchObservedRunningTime="2025-10-02 11:15:18.122982185 +0000 UTC m=+1194.682889776" Oct 02 11:15:20 crc kubenswrapper[4835]: I1002 11:15:20.115632 4835 generic.go:334] "Generic (PLEG): container finished" podID="e627031e-a0d4-459c-9250-bfdcf645d133" containerID="9ee76de918bc4e19b8144d3bd91d16557d815e1f1a8213cead7b8641ab72641f" exitCode=0 Oct 02 11:15:20 crc kubenswrapper[4835]: I1002 11:15:20.115725 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rk79x" event={"ID":"e627031e-a0d4-459c-9250-bfdcf645d133","Type":"ContainerDied","Data":"9ee76de918bc4e19b8144d3bd91d16557d815e1f1a8213cead7b8641ab72641f"} Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.120101 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4p7h4"] Oct 02 11:15:21 crc kubenswrapper[4835]: E1002 11:15:21.126673 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905629f-e865-4e05-a222-a84e1fa0b88a" containerName="collect-profiles" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.126800 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905629f-e865-4e05-a222-a84e1fa0b88a" 
containerName="collect-profiles" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.127093 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2905629f-e865-4e05-a222-a84e1fa0b88a" containerName="collect-profiles" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.128061 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4p7h4" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.132957 4835 generic.go:334] "Generic (PLEG): container finished" podID="fc534d7c-ef08-44e5-b56d-d3421477c51d" containerID="cd596dd7792ef11c94784376d1552c2863357281364a72de1abc0804776a24a8" exitCode=0 Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.133198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2vdk6" event={"ID":"fc534d7c-ef08-44e5-b56d-d3421477c51d","Type":"ContainerDied","Data":"cd596dd7792ef11c94784376d1552c2863357281364a72de1abc0804776a24a8"} Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.138846 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4p7h4"] Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.197418 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxl9\" (UniqueName: \"kubernetes.io/projected/0be5aafd-caf7-4d3e-a651-774f264ff938-kube-api-access-vhxl9\") pod \"nova-api-db-create-4p7h4\" (UID: \"0be5aafd-caf7-4d3e-a651-774f264ff938\") " pod="openstack/nova-api-db-create-4p7h4" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.234363 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5nwmn"] Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.235862 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5nwmn" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.265132 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5nwmn"] Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.301454 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmc9f\" (UniqueName: \"kubernetes.io/projected/7aa9590e-e7df-41eb-9dd8-d22a2f382b94-kube-api-access-pmc9f\") pod \"nova-cell0-db-create-5nwmn\" (UID: \"7aa9590e-e7df-41eb-9dd8-d22a2f382b94\") " pod="openstack/nova-cell0-db-create-5nwmn" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.301569 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxl9\" (UniqueName: \"kubernetes.io/projected/0be5aafd-caf7-4d3e-a651-774f264ff938-kube-api-access-vhxl9\") pod \"nova-api-db-create-4p7h4\" (UID: \"0be5aafd-caf7-4d3e-a651-774f264ff938\") " pod="openstack/nova-api-db-create-4p7h4" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.333592 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-glqq5"] Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.335086 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-glqq5" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.343450 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxl9\" (UniqueName: \"kubernetes.io/projected/0be5aafd-caf7-4d3e-a651-774f264ff938-kube-api-access-vhxl9\") pod \"nova-api-db-create-4p7h4\" (UID: \"0be5aafd-caf7-4d3e-a651-774f264ff938\") " pod="openstack/nova-api-db-create-4p7h4" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.366906 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-glqq5"] Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.405089 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmc9f\" (UniqueName: \"kubernetes.io/projected/7aa9590e-e7df-41eb-9dd8-d22a2f382b94-kube-api-access-pmc9f\") pod \"nova-cell0-db-create-5nwmn\" (UID: \"7aa9590e-e7df-41eb-9dd8-d22a2f382b94\") " pod="openstack/nova-cell0-db-create-5nwmn" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.405173 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zngg\" (UniqueName: \"kubernetes.io/projected/fc6d4089-7586-4a01-a18e-7cdb9da91783-kube-api-access-9zngg\") pod \"nova-cell1-db-create-glqq5\" (UID: \"fc6d4089-7586-4a01-a18e-7cdb9da91783\") " pod="openstack/nova-cell1-db-create-glqq5" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.446453 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4p7h4" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.446771 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmc9f\" (UniqueName: \"kubernetes.io/projected/7aa9590e-e7df-41eb-9dd8-d22a2f382b94-kube-api-access-pmc9f\") pod \"nova-cell0-db-create-5nwmn\" (UID: \"7aa9590e-e7df-41eb-9dd8-d22a2f382b94\") " pod="openstack/nova-cell0-db-create-5nwmn" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.506912 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zngg\" (UniqueName: \"kubernetes.io/projected/fc6d4089-7586-4a01-a18e-7cdb9da91783-kube-api-access-9zngg\") pod \"nova-cell1-db-create-glqq5\" (UID: \"fc6d4089-7586-4a01-a18e-7cdb9da91783\") " pod="openstack/nova-cell1-db-create-glqq5" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.534564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zngg\" (UniqueName: \"kubernetes.io/projected/fc6d4089-7586-4a01-a18e-7cdb9da91783-kube-api-access-9zngg\") pod \"nova-cell1-db-create-glqq5\" (UID: \"fc6d4089-7586-4a01-a18e-7cdb9da91783\") " pod="openstack/nova-cell1-db-create-glqq5" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.561251 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5nwmn" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.676543 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rk79x" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.709197 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-glqq5" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.812494 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-db-sync-config-data\") pod \"e627031e-a0d4-459c-9250-bfdcf645d133\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.812565 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t69tz\" (UniqueName: \"kubernetes.io/projected/e627031e-a0d4-459c-9250-bfdcf645d133-kube-api-access-t69tz\") pod \"e627031e-a0d4-459c-9250-bfdcf645d133\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.812612 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-combined-ca-bundle\") pod \"e627031e-a0d4-459c-9250-bfdcf645d133\" (UID: \"e627031e-a0d4-459c-9250-bfdcf645d133\") " Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.817157 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e627031e-a0d4-459c-9250-bfdcf645d133" (UID: "e627031e-a0d4-459c-9250-bfdcf645d133"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.819881 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e627031e-a0d4-459c-9250-bfdcf645d133-kube-api-access-t69tz" (OuterVolumeSpecName: "kube-api-access-t69tz") pod "e627031e-a0d4-459c-9250-bfdcf645d133" (UID: "e627031e-a0d4-459c-9250-bfdcf645d133"). InnerVolumeSpecName "kube-api-access-t69tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.851716 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e627031e-a0d4-459c-9250-bfdcf645d133" (UID: "e627031e-a0d4-459c-9250-bfdcf645d133"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.915781 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.915838 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t69tz\" (UniqueName: \"kubernetes.io/projected/e627031e-a0d4-459c-9250-bfdcf645d133-kube-api-access-t69tz\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:21 crc kubenswrapper[4835]: I1002 11:15:21.915854 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e627031e-a0d4-459c-9250-bfdcf645d133-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:22 crc kubenswrapper[4835]: W1002 11:15:22.077963 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0be5aafd_caf7_4d3e_a651_774f264ff938.slice/crio-6fe4360aa8ac483174e69960fe8c59e54f005190466f57dcebcd48ddf582822b WatchSource:0}: Error finding container 6fe4360aa8ac483174e69960fe8c59e54f005190466f57dcebcd48ddf582822b: Status 404 returned error can't find the container with id 6fe4360aa8ac483174e69960fe8c59e54f005190466f57dcebcd48ddf582822b Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.084678 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4p7h4"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.157496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4p7h4" event={"ID":"0be5aafd-caf7-4d3e-a651-774f264ff938","Type":"ContainerStarted","Data":"6fe4360aa8ac483174e69960fe8c59e54f005190466f57dcebcd48ddf582822b"} Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.178796 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5nwmn"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.195125 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rk79x" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.195283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rk79x" event={"ID":"e627031e-a0d4-459c-9250-bfdcf645d133","Type":"ContainerDied","Data":"c5a409133c595f798ef634ee581676581bf3fd63fd38629a70fdb53db55d4f82"} Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.197349 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a409133c595f798ef634ee581676581bf3fd63fd38629a70fdb53db55d4f82" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.325172 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-glqq5"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.567705 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5547df6bdc-2nktk"] Oct 02 11:15:22 crc kubenswrapper[4835]: E1002 11:15:22.568566 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e627031e-a0d4-459c-9250-bfdcf645d133" containerName="barbican-db-sync" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.568595 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e627031e-a0d4-459c-9250-bfdcf645d133" containerName="barbican-db-sync" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.568819 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e627031e-a0d4-459c-9250-bfdcf645d133" containerName="barbican-db-sync" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.569989 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.573963 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.574363 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.574526 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4fkhm" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.604906 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.607121 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.611621 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.641229 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1683aa-2540-417c-8334-10082451475b-logs\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.641425 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-config-data\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.641510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kqpr\" (UniqueName: \"kubernetes.io/projected/6e1683aa-2540-417c-8334-10082451475b-kube-api-access-7kqpr\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.641621 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-combined-ca-bundle\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.641708 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-config-data-custom\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.648160 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.688977 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5547df6bdc-2nktk"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744048 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kqpr\" (UniqueName: \"kubernetes.io/projected/6e1683aa-2540-417c-8334-10082451475b-kube-api-access-7kqpr\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-combined-ca-bundle\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 
11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744168 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6557e02c-d16a-4b3b-8d22-00662118e581-logs\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-config-data-custom\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744266 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-combined-ca-bundle\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744397 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-config-data-custom\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744501 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-config-data\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744622 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1683aa-2540-417c-8334-10082451475b-logs\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744693 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-645jp\" (UniqueName: \"kubernetes.io/projected/6557e02c-d16a-4b3b-8d22-00662118e581-kube-api-access-645jp\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.744834 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-config-data\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.755901 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nlwll"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.757198 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1683aa-2540-417c-8334-10082451475b-logs\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.757742 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.766635 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-combined-ca-bundle\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.786905 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nlwll"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.794746 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-config-data\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.796259 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kqpr\" (UniqueName: \"kubernetes.io/projected/6e1683aa-2540-417c-8334-10082451475b-kube-api-access-7kqpr\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.807379 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e1683aa-2540-417c-8334-10082451475b-config-data-custom\") pod \"barbican-worker-5547df6bdc-2nktk\" (UID: \"6e1683aa-2540-417c-8334-10082451475b\") " pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.861768 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfxr6\" (UniqueName: \"kubernetes.io/projected/27eb26b3-23ed-42a0-949c-fc1cce4056f1-kube-api-access-nfxr6\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.861825 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.861866 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6557e02c-d16a-4b3b-8d22-00662118e581-logs\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.861894 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-config-data-custom\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.861947 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-config-data\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.861974 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-config\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.861989 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.862049 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-645jp\" (UniqueName: \"kubernetes.io/projected/6557e02c-d16a-4b3b-8d22-00662118e581-kube-api-access-645jp\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.862064 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-dns-svc\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.862166 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-combined-ca-bundle\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.863321 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6557e02c-d16a-4b3b-8d22-00662118e581-logs\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.874207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-config-data-custom\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 
11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.875093 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-config-data\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.877603 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6557e02c-d16a-4b3b-8d22-00662118e581-combined-ca-bundle\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.881427 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b69ffbbb-ht5c9"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.883565 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.888596 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.894999 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b69ffbbb-ht5c9"] Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.919467 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-645jp\" (UniqueName: \"kubernetes.io/projected/6557e02c-d16a-4b3b-8d22-00662118e581-kube-api-access-645jp\") pod \"barbican-keystone-listener-7cfcd5b5b-2xqb4\" (UID: \"6557e02c-d16a-4b3b-8d22-00662118e581\") " pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.960270 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5547df6bdc-2nktk" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.962472 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data-custom\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964259 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-config\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964282 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964328 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-dns-svc\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964375 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964416 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6g8v\" (UniqueName: \"kubernetes.io/projected/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-kube-api-access-m6g8v\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964505 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-logs\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964537 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-combined-ca-bundle\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfxr6\" (UniqueName: \"kubernetes.io/projected/27eb26b3-23ed-42a0-949c-fc1cce4056f1-kube-api-access-nfxr6\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: 
\"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.964599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.966077 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.967601 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-config\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.968110 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.968658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-dns-svc\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:22 crc kubenswrapper[4835]: I1002 11:15:22.985348 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.029376 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfxr6\" (UniqueName: \"kubernetes.io/projected/27eb26b3-23ed-42a0-949c-fc1cce4056f1-kube-api-access-nfxr6\") pod \"dnsmasq-dns-699df9757c-nlwll\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.070942 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svdb\" (UniqueName: \"kubernetes.io/projected/fc534d7c-ef08-44e5-b56d-d3421477c51d-kube-api-access-4svdb\") pod \"fc534d7c-ef08-44e5-b56d-d3421477c51d\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.071455 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-scripts\") pod \"fc534d7c-ef08-44e5-b56d-d3421477c51d\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.071541 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-config-data\") pod \"fc534d7c-ef08-44e5-b56d-d3421477c51d\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.071598 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-combined-ca-bundle\") pod \"fc534d7c-ef08-44e5-b56d-d3421477c51d\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.071668 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-db-sync-config-data\") pod \"fc534d7c-ef08-44e5-b56d-d3421477c51d\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.071692 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc534d7c-ef08-44e5-b56d-d3421477c51d-etc-machine-id\") pod \"fc534d7c-ef08-44e5-b56d-d3421477c51d\" (UID: \"fc534d7c-ef08-44e5-b56d-d3421477c51d\") " Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.072024 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data-custom\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.072173 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.072204 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6g8v\" 
(UniqueName: \"kubernetes.io/projected/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-kube-api-access-m6g8v\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.072286 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-logs\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.072317 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-combined-ca-bundle\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.080313 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-scripts" (OuterVolumeSpecName: "scripts") pod "fc534d7c-ef08-44e5-b56d-d3421477c51d" (UID: "fc534d7c-ef08-44e5-b56d-d3421477c51d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.081798 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc534d7c-ef08-44e5-b56d-d3421477c51d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fc534d7c-ef08-44e5-b56d-d3421477c51d" (UID: "fc534d7c-ef08-44e5-b56d-d3421477c51d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.082162 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-logs\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.086465 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc534d7c-ef08-44e5-b56d-d3421477c51d-kube-api-access-4svdb" (OuterVolumeSpecName: "kube-api-access-4svdb") pod "fc534d7c-ef08-44e5-b56d-d3421477c51d" (UID: "fc534d7c-ef08-44e5-b56d-d3421477c51d"). InnerVolumeSpecName "kube-api-access-4svdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.087375 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data-custom\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.103133 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6g8v\" (UniqueName: \"kubernetes.io/projected/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-kube-api-access-m6g8v\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.107653 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fc534d7c-ef08-44e5-b56d-d3421477c51d" (UID: "fc534d7c-ef08-44e5-b56d-d3421477c51d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.110308 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.112326 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-combined-ca-bundle\") pod \"barbican-api-7b69ffbbb-ht5c9\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.174662 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svdb\" (UniqueName: \"kubernetes.io/projected/fc534d7c-ef08-44e5-b56d-d3421477c51d-kube-api-access-4svdb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.174697 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.174706 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.174735 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc534d7c-ef08-44e5-b56d-d3421477c51d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.192105 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc534d7c-ef08-44e5-b56d-d3421477c51d" (UID: "fc534d7c-ef08-44e5-b56d-d3421477c51d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.217503 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-config-data" (OuterVolumeSpecName: "config-data") pod "fc534d7c-ef08-44e5-b56d-d3421477c51d" (UID: "fc534d7c-ef08-44e5-b56d-d3421477c51d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.231396 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-glqq5" event={"ID":"fc6d4089-7586-4a01-a18e-7cdb9da91783","Type":"ContainerStarted","Data":"437a0cbe29e379a31f823500d05a253be92ecd244eea66fdb4f15d5f0ebe61a0"} Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.231448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-glqq5" event={"ID":"fc6d4089-7586-4a01-a18e-7cdb9da91783","Type":"ContainerStarted","Data":"44ef33bedd825056e82c08262f51fa526731dee1e3c09c282e8cf9f8ef663943"} Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.253775 4835 generic.go:334] "Generic (PLEG): container finished" podID="0be5aafd-caf7-4d3e-a651-774f264ff938" containerID="5f94f9e7417bf348c29cadfb2e873a282041a1a30fd5ae756807f4d33b8acdb5" exitCode=0 Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.253922 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4p7h4" event={"ID":"0be5aafd-caf7-4d3e-a651-774f264ff938","Type":"ContainerDied","Data":"5f94f9e7417bf348c29cadfb2e873a282041a1a30fd5ae756807f4d33b8acdb5"} Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.257862 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.261894 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-glqq5" podStartSLOduration=2.261867122 podStartE2EDuration="2.261867122s" podCreationTimestamp="2025-10-02 11:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:23.253813282 +0000 UTC m=+1199.813720863" watchObservedRunningTime="2025-10-02 11:15:23.261867122 +0000 UTC m=+1199.821774703" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.264982 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5nwmn" event={"ID":"7aa9590e-e7df-41eb-9dd8-d22a2f382b94","Type":"ContainerStarted","Data":"263b248f928928c12d3e51bd98da87292f41d4d0eb1b0152d162c77501978a29"} Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.265042 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5nwmn" event={"ID":"7aa9590e-e7df-41eb-9dd8-d22a2f382b94","Type":"ContainerStarted","Data":"bc8973937f8f4ef586b561ac7418346dd690f199c01888b5a3b5cfb2b5f198b0"} Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.266513 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.277350 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.277383 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc534d7c-ef08-44e5-b56d-d3421477c51d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.291580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2vdk6" event={"ID":"fc534d7c-ef08-44e5-b56d-d3421477c51d","Type":"ContainerDied","Data":"1ce615fb1caaf17c8de25aee44904ab29fb81696cd112509a16666576484a582"} Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.291624 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ce615fb1caaf17c8de25aee44904ab29fb81696cd112509a16666576484a582" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.291710 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2vdk6" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.552769 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:23 crc kubenswrapper[4835]: E1002 11:15:23.553648 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc534d7c-ef08-44e5-b56d-d3421477c51d" containerName="cinder-db-sync" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.553669 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc534d7c-ef08-44e5-b56d-d3421477c51d" containerName="cinder-db-sync" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.553904 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc534d7c-ef08-44e5-b56d-d3421477c51d" containerName="cinder-db-sync" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.555056 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.558121 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.558386 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.558527 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lmqcj" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.558532 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.630673 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.678465 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nlwll"] Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.695836 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.695930 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9v7w\" (UniqueName: \"kubernetes.io/projected/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-kube-api-access-g9v7w\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.696013 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.696049 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.696077 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.696150 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.761686 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b76cdf485-qt4lf"] Oct 02 11:15:23 crc 
kubenswrapper[4835]: I1002 11:15:23.764942 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.771956 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b76cdf485-qt4lf"] Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.800526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.800558 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.800578 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.800611 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.800855 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.800892 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9v7w\" (UniqueName: \"kubernetes.io/projected/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-kube-api-access-g9v7w\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.803082 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.806840 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5547df6bdc-2nktk"] Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.810588 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.810902 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.816496 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.826752 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.842941 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9v7w\" (UniqueName: \"kubernetes.io/projected/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-kube-api-access-g9v7w\") pod \"cinder-scheduler-0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.869709 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.880289 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.884009 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.896751 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.898265 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4"] Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.908441 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7x7\" (UniqueName: \"kubernetes.io/projected/9c187566-73e0-40aa-a7db-a0cde5431b83-kube-api-access-vl7x7\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.908511 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-sb\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.908547 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-nb\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.908588 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-config\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.908631 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-dns-svc\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:23 crc kubenswrapper[4835]: I1002 11:15:23.924766 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010465 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f1bef34-8be8-4329-a61b-9949a5d9493f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010521 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010568 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dfm4\" (UniqueName: \"kubernetes.io/projected/3f1bef34-8be8-4329-a61b-9949a5d9493f-kube-api-access-8dfm4\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " 
pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010631 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010657 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010748 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7x7\" (UniqueName: \"kubernetes.io/projected/9c187566-73e0-40aa-a7db-a0cde5431b83-kube-api-access-vl7x7\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010772 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-scripts\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010813 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-sb\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010845 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-nb\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010889 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f1bef34-8be8-4329-a61b-9949a5d9493f-logs\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-config\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.010966 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-dns-svc\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.012354 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-sb\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.012418 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-nb\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.013152 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-config\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.021363 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-dns-svc\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.032207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7x7\" (UniqueName: \"kubernetes.io/projected/9c187566-73e0-40aa-a7db-a0cde5431b83-kube-api-access-vl7x7\") pod \"dnsmasq-dns-5b76cdf485-qt4lf\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.067778 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b69ffbbb-ht5c9"] Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112585 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f1bef34-8be8-4329-a61b-9949a5d9493f-logs\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112697 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f1bef34-8be8-4329-a61b-9949a5d9493f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112773 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dfm4\" (UniqueName: \"kubernetes.io/projected/3f1bef34-8be8-4329-a61b-9949a5d9493f-kube-api-access-8dfm4\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112827 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3f1bef34-8be8-4329-a61b-9949a5d9493f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112910 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.112987 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-scripts\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.113790 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f1bef34-8be8-4329-a61b-9949a5d9493f-logs\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.124177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.125625 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nlwll"] Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.130110 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.137580 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-scripts\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.137581 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.140792 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dfm4\" (UniqueName: \"kubernetes.io/projected/3f1bef34-8be8-4329-a61b-9949a5d9493f-kube-api-access-8dfm4\") pod \"cinder-api-0\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: W1002 11:15:24.172336 4835 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27eb26b3_23ed_42a0_949c_fc1cce4056f1.slice/crio-4a0d208c0d71830c52463d794932b72095f52d809b0812d6884485369e53caee WatchSource:0}: Error finding container 4a0d208c0d71830c52463d794932b72095f52d809b0812d6884485369e53caee: Status 404 returned error can't find the container with id 4a0d208c0d71830c52463d794932b72095f52d809b0812d6884485369e53caee Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.254084 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.310704 4835 generic.go:334] "Generic (PLEG): container finished" podID="fc6d4089-7586-4a01-a18e-7cdb9da91783" containerID="437a0cbe29e379a31f823500d05a253be92ecd244eea66fdb4f15d5f0ebe61a0" exitCode=0 Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.311162 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-glqq5" event={"ID":"fc6d4089-7586-4a01-a18e-7cdb9da91783","Type":"ContainerDied","Data":"437a0cbe29e379a31f823500d05a253be92ecd244eea66fdb4f15d5f0ebe61a0"} Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.316273 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5547df6bdc-2nktk" event={"ID":"6e1683aa-2540-417c-8334-10082451475b","Type":"ContainerStarted","Data":"ad84411329e10ed61c072f1be40764561fc5f0706deec9d28df1555b702c2096"} Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.333178 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.337288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nlwll" event={"ID":"27eb26b3-23ed-42a0-949c-fc1cce4056f1","Type":"ContainerStarted","Data":"4a0d208c0d71830c52463d794932b72095f52d809b0812d6884485369e53caee"} Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.340716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" event={"ID":"6557e02c-d16a-4b3b-8d22-00662118e581","Type":"ContainerStarted","Data":"6a805fc1e4d1b1252576fabbed8590dee06b929e3a39265048a4617b7b4b8c99"} Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.346670 4835 generic.go:334] "Generic (PLEG): container finished" podID="7aa9590e-e7df-41eb-9dd8-d22a2f382b94" containerID="263b248f928928c12d3e51bd98da87292f41d4d0eb1b0152d162c77501978a29" exitCode=0 Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.346774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5nwmn" event={"ID":"7aa9590e-e7df-41eb-9dd8-d22a2f382b94","Type":"ContainerDied","Data":"263b248f928928c12d3e51bd98da87292f41d4d0eb1b0152d162c77501978a29"} Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.356670 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b69ffbbb-ht5c9" event={"ID":"813d1ab2-14f3-4726-9b9a-3fe1be2a1395","Type":"ContainerStarted","Data":"c54ec2f27569fb4e84b5015404155a0f96745d34fbea4f63ca42bdbabbd0a43d"} Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.528387 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.944400 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5nwmn" Oct 02 11:15:24 crc kubenswrapper[4835]: I1002 11:15:24.972080 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4p7h4" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.054507 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhxl9\" (UniqueName: \"kubernetes.io/projected/0be5aafd-caf7-4d3e-a651-774f264ff938-kube-api-access-vhxl9\") pod \"0be5aafd-caf7-4d3e-a651-774f264ff938\" (UID: \"0be5aafd-caf7-4d3e-a651-774f264ff938\") " Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.054730 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmc9f\" (UniqueName: \"kubernetes.io/projected/7aa9590e-e7df-41eb-9dd8-d22a2f382b94-kube-api-access-pmc9f\") pod \"7aa9590e-e7df-41eb-9dd8-d22a2f382b94\" (UID: \"7aa9590e-e7df-41eb-9dd8-d22a2f382b94\") " Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.067278 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be5aafd-caf7-4d3e-a651-774f264ff938-kube-api-access-vhxl9" (OuterVolumeSpecName: "kube-api-access-vhxl9") pod "0be5aafd-caf7-4d3e-a651-774f264ff938" (UID: "0be5aafd-caf7-4d3e-a651-774f264ff938"). InnerVolumeSpecName "kube-api-access-vhxl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.082257 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa9590e-e7df-41eb-9dd8-d22a2f382b94-kube-api-access-pmc9f" (OuterVolumeSpecName: "kube-api-access-pmc9f") pod "7aa9590e-e7df-41eb-9dd8-d22a2f382b94" (UID: "7aa9590e-e7df-41eb-9dd8-d22a2f382b94"). InnerVolumeSpecName "kube-api-access-pmc9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.160750 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhxl9\" (UniqueName: \"kubernetes.io/projected/0be5aafd-caf7-4d3e-a651-774f264ff938-kube-api-access-vhxl9\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.160805 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmc9f\" (UniqueName: \"kubernetes.io/projected/7aa9590e-e7df-41eb-9dd8-d22a2f382b94-kube-api-access-pmc9f\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.199169 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.207591 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b76cdf485-qt4lf"] Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.310415 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.378398 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0","Type":"ContainerStarted","Data":"697160650c044cad114ac9e748180dc38410f1ee8be2fd988374974b05884166"} Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.414152 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4p7h4" event={"ID":"0be5aafd-caf7-4d3e-a651-774f264ff938","Type":"ContainerDied","Data":"6fe4360aa8ac483174e69960fe8c59e54f005190466f57dcebcd48ddf582822b"} Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.414201 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe4360aa8ac483174e69960fe8c59e54f005190466f57dcebcd48ddf582822b" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.414318 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4p7h4" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.434586 4835 generic.go:334] "Generic (PLEG): container finished" podID="27eb26b3-23ed-42a0-949c-fc1cce4056f1" containerID="bc2ed4f9b2148882c2ed43255488e6a81d22645623557580678a23d02ea275b6" exitCode=0 Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.434667 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nlwll" event={"ID":"27eb26b3-23ed-42a0-949c-fc1cce4056f1","Type":"ContainerDied","Data":"bc2ed4f9b2148882c2ed43255488e6a81d22645623557580678a23d02ea275b6"} Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.461531 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5nwmn" event={"ID":"7aa9590e-e7df-41eb-9dd8-d22a2f382b94","Type":"ContainerDied","Data":"bc8973937f8f4ef586b561ac7418346dd690f199c01888b5a3b5cfb2b5f198b0"} Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.461579 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc8973937f8f4ef586b561ac7418346dd690f199c01888b5a3b5cfb2b5f198b0" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.461652 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5nwmn" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.520370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b69ffbbb-ht5c9" event={"ID":"813d1ab2-14f3-4726-9b9a-3fe1be2a1395","Type":"ContainerStarted","Data":"f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4"} Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.520442 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.520456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b69ffbbb-ht5c9" event={"ID":"813d1ab2-14f3-4726-9b9a-3fe1be2a1395","Type":"ContainerStarted","Data":"02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe"} Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.520482 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:25 crc kubenswrapper[4835]: I1002 11:15:25.605642 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b69ffbbb-ht5c9" podStartSLOduration=3.605617545 podStartE2EDuration="3.605617545s" podCreationTimestamp="2025-10-02 11:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:25.567821971 +0000 UTC m=+1202.127729562" watchObservedRunningTime="2025-10-02 11:15:25.605617545 +0000 UTC m=+1202.165525126" Oct 02 11:15:26 crc kubenswrapper[4835]: I1002 11:15:26.530711 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" event={"ID":"9c187566-73e0-40aa-a7db-a0cde5431b83","Type":"ContainerStarted","Data":"e204b616555b316e8b49a0d45241971461c04141498dd85b0cf3e8405009c4cf"} Oct 02 11:15:26 crc kubenswrapper[4835]: I1002 11:15:26.534583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3f1bef34-8be8-4329-a61b-9949a5d9493f","Type":"ContainerStarted","Data":"93da7f7c96d879904ea56337e54d3627807af29a1cce98f44b34119351efee09"} Oct 02 11:15:26 crc kubenswrapper[4835]: I1002 11:15:26.987171 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:26 crc kubenswrapper[4835]: I1002 11:15:26.992631 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-glqq5" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.114341 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zngg\" (UniqueName: \"kubernetes.io/projected/fc6d4089-7586-4a01-a18e-7cdb9da91783-kube-api-access-9zngg\") pod \"fc6d4089-7586-4a01-a18e-7cdb9da91783\" (UID: \"fc6d4089-7586-4a01-a18e-7cdb9da91783\") " Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.114740 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfxr6\" (UniqueName: \"kubernetes.io/projected/27eb26b3-23ed-42a0-949c-fc1cce4056f1-kube-api-access-nfxr6\") pod \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.114841 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-dns-svc\") pod \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.114970 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-config\") pod \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.115000 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-nb\") pod \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.115682 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-sb\") pod \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\" (UID: \"27eb26b3-23ed-42a0-949c-fc1cce4056f1\") " Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.137870 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6d4089-7586-4a01-a18e-7cdb9da91783-kube-api-access-9zngg" (OuterVolumeSpecName: "kube-api-access-9zngg") pod "fc6d4089-7586-4a01-a18e-7cdb9da91783" (UID: "fc6d4089-7586-4a01-a18e-7cdb9da91783"). InnerVolumeSpecName "kube-api-access-9zngg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.153596 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27eb26b3-23ed-42a0-949c-fc1cce4056f1" (UID: "27eb26b3-23ed-42a0-949c-fc1cce4056f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.159910 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27eb26b3-23ed-42a0-949c-fc1cce4056f1" (UID: "27eb26b3-23ed-42a0-949c-fc1cce4056f1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.162239 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27eb26b3-23ed-42a0-949c-fc1cce4056f1" (UID: "27eb26b3-23ed-42a0-949c-fc1cce4056f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.182630 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27eb26b3-23ed-42a0-949c-fc1cce4056f1-kube-api-access-nfxr6" (OuterVolumeSpecName: "kube-api-access-nfxr6") pod "27eb26b3-23ed-42a0-949c-fc1cce4056f1" (UID: "27eb26b3-23ed-42a0-949c-fc1cce4056f1"). InnerVolumeSpecName "kube-api-access-nfxr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.185009 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-config" (OuterVolumeSpecName: "config") pod "27eb26b3-23ed-42a0-949c-fc1cce4056f1" (UID: "27eb26b3-23ed-42a0-949c-fc1cce4056f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.219883 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.219931 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zngg\" (UniqueName: \"kubernetes.io/projected/fc6d4089-7586-4a01-a18e-7cdb9da91783-kube-api-access-9zngg\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.219948 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfxr6\" (UniqueName: \"kubernetes.io/projected/27eb26b3-23ed-42a0-949c-fc1cce4056f1-kube-api-access-nfxr6\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.219965 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.219982 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.219994 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27eb26b3-23ed-42a0-949c-fc1cce4056f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.314371 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.544570 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-glqq5" event={"ID":"fc6d4089-7586-4a01-a18e-7cdb9da91783","Type":"ContainerDied","Data":"44ef33bedd825056e82c08262f51fa526731dee1e3c09c282e8cf9f8ef663943"} Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.544605 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-glqq5" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.544613 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44ef33bedd825056e82c08262f51fa526731dee1e3c09c282e8cf9f8ef663943" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.546231 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nlwll" event={"ID":"27eb26b3-23ed-42a0-949c-fc1cce4056f1","Type":"ContainerDied","Data":"4a0d208c0d71830c52463d794932b72095f52d809b0812d6884485369e53caee"} Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.546252 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nlwll" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.546264 4835 scope.go:117] "RemoveContainer" containerID="bc2ed4f9b2148882c2ed43255488e6a81d22645623557580678a23d02ea275b6" Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.548111 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3f1bef34-8be8-4329-a61b-9949a5d9493f","Type":"ContainerStarted","Data":"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117"} Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.550838 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0","Type":"ContainerStarted","Data":"4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8"} Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.564984 4835 generic.go:334] "Generic (PLEG): container finished" podID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerID="7b371cec8c743bde02b3a84e6736dc65f9b7693beaccf5bc0c8dce9886bf92a0" exitCode=0 Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.565026 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" event={"ID":"9c187566-73e0-40aa-a7db-a0cde5431b83","Type":"ContainerDied","Data":"7b371cec8c743bde02b3a84e6736dc65f9b7693beaccf5bc0c8dce9886bf92a0"} Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.606875 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nlwll"] Oct 02 11:15:27 crc kubenswrapper[4835]: I1002 11:15:27.641437 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nlwll"] Oct 02 11:15:28 crc kubenswrapper[4835]: I1002 11:15:28.280936 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27eb26b3-23ed-42a0-949c-fc1cce4056f1" path="/var/lib/kubelet/pods/27eb26b3-23ed-42a0-949c-fc1cce4056f1/volumes" Oct 02 11:15:28 crc kubenswrapper[4835]: I1002 11:15:28.593475 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" event={"ID":"9c187566-73e0-40aa-a7db-a0cde5431b83","Type":"ContainerStarted","Data":"8ef9aa002581037af6725b688d3bc63cb018749d0d9c7edbcc233bf191cf618a"} Oct 02 11:15:28 crc kubenswrapper[4835]: I1002 11:15:28.593835 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:28 crc kubenswrapper[4835]: I1002 11:15:28.607181 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5547df6bdc-2nktk" event={"ID":"6e1683aa-2540-417c-8334-10082451475b","Type":"ContainerStarted","Data":"eb72c70f4574ddedc07234b80635a052ff2ee1df9c4646d2b4fe95f1f264978c"} Oct 02 11:15:28 crc 
kubenswrapper[4835]: I1002 11:15:28.618457 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" event={"ID":"6557e02c-d16a-4b3b-8d22-00662118e581","Type":"ContainerStarted","Data":"8e3865390a7d836f918e72e0dc06ac4a7d4d7a97f45f8e06a21c2a06fd64407d"} Oct 02 11:15:28 crc kubenswrapper[4835]: I1002 11:15:28.621118 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" podStartSLOduration=5.62109859 podStartE2EDuration="5.62109859s" podCreationTimestamp="2025-10-02 11:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:28.61760601 +0000 UTC m=+1205.177513591" watchObservedRunningTime="2025-10-02 11:15:28.62109859 +0000 UTC m=+1205.181006171" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.630341 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0","Type":"ContainerStarted","Data":"dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847"} Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.646994 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5547df6bdc-2nktk" event={"ID":"6e1683aa-2540-417c-8334-10082451475b","Type":"ContainerStarted","Data":"65af9f782c9300b9c2765229aedbab853b3c120e292296255d21169ff6be42cd"} Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.654190 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.470287965 podStartE2EDuration="6.654172585s" podCreationTimestamp="2025-10-02 11:15:23 +0000 UTC" firstStartedPulling="2025-10-02 11:15:24.572414466 +0000 UTC m=+1201.132322047" lastFinishedPulling="2025-10-02 11:15:25.756299076 +0000 UTC m=+1202.316206667" observedRunningTime="2025-10-02 11:15:29.646705331 +0000 UTC m=+1206.206612912" watchObservedRunningTime="2025-10-02 11:15:29.654172585 +0000 UTC m=+1206.214080166" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.654492 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" event={"ID":"6557e02c-d16a-4b3b-8d22-00662118e581","Type":"ContainerStarted","Data":"66cecd0640f6b14873be2c9c690053faf5b65528672ce8771a86789591c82728"} Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.659720 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3f1bef34-8be8-4329-a61b-9949a5d9493f","Type":"ContainerStarted","Data":"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da"} Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.660021 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api-log" containerID="cri-o://9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117" gracePeriod=30 Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.660062 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api" containerID="cri-o://b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da" gracePeriod=30 Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.696931 4835 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/barbican-worker-5547df6bdc-2nktk" podStartSLOduration=3.64976228 podStartE2EDuration="7.69691097s" podCreationTimestamp="2025-10-02 11:15:22 +0000 UTC" firstStartedPulling="2025-10-02 11:15:23.801984672 +0000 UTC m=+1200.361892253" lastFinishedPulling="2025-10-02 11:15:27.849133362 +0000 UTC m=+1204.409040943" observedRunningTime="2025-10-02 11:15:29.679262654 +0000 UTC m=+1206.239170235" watchObservedRunningTime="2025-10-02 11:15:29.69691097 +0000 UTC m=+1206.256818551" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.704754 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.704736235 podStartE2EDuration="6.704736235s" podCreationTimestamp="2025-10-02 11:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:29.694963755 +0000 UTC m=+1206.254871376" watchObservedRunningTime="2025-10-02 11:15:29.704736235 +0000 UTC m=+1206.264643806" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.730084 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7cfcd5b5b-2xqb4" podStartSLOduration=3.763488501 podStartE2EDuration="7.73003302s" podCreationTimestamp="2025-10-02 11:15:22 +0000 UTC" firstStartedPulling="2025-10-02 11:15:23.882499171 +0000 UTC m=+1200.442406752" lastFinishedPulling="2025-10-02 11:15:27.84904369 +0000 UTC m=+1204.408951271" observedRunningTime="2025-10-02 11:15:29.722040431 +0000 UTC m=+1206.281948012" watchObservedRunningTime="2025-10-02 11:15:29.73003302 +0000 UTC m=+1206.289940601" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.833241 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5956b78c54-6g8cs"] Oct 02 11:15:29 crc kubenswrapper[4835]: E1002 11:15:29.833704 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6d4089-7586-4a01-a18e-7cdb9da91783" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.833728 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6d4089-7586-4a01-a18e-7cdb9da91783" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: E1002 11:15:29.833750 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa9590e-e7df-41eb-9dd8-d22a2f382b94" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.833758 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa9590e-e7df-41eb-9dd8-d22a2f382b94" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: E1002 11:15:29.833770 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be5aafd-caf7-4d3e-a651-774f264ff938" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.833776 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be5aafd-caf7-4d3e-a651-774f264ff938" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: E1002 11:15:29.833790 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27eb26b3-23ed-42a0-949c-fc1cce4056f1" containerName="init" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.833795 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="27eb26b3-23ed-42a0-949c-fc1cce4056f1" containerName="init" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.833964 4835 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6d4089-7586-4a01-a18e-7cdb9da91783" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.833985 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa9590e-e7df-41eb-9dd8-d22a2f382b94" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.834005 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be5aafd-caf7-4d3e-a651-774f264ff938" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.834015 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="27eb26b3-23ed-42a0-949c-fc1cce4056f1" containerName="init" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.835039 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.837592 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.838160 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.851595 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5956b78c54-6g8cs"] Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.910449 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8d82\" (UniqueName: \"kubernetes.io/projected/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-kube-api-access-p8d82\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.910500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-combined-ca-bundle\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.910538 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-config-data\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.910577 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-logs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.910644 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-public-tls-certs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.910674 
4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-config-data-custom\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:29 crc kubenswrapper[4835]: I1002 11:15:29.910714 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-internal-tls-certs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.012268 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8d82\" (UniqueName: \"kubernetes.io/projected/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-kube-api-access-p8d82\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.012330 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-combined-ca-bundle\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.012365 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-config-data\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.012404 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-logs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.012460 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-public-tls-certs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.012484 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-config-data-custom\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.012518 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-internal-tls-certs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 
11:15:30.018713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-logs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.039456 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-internal-tls-certs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.042867 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-config-data\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.044068 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-combined-ca-bundle\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.045614 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-public-tls-certs\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.050909 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-config-data-custom\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.050954 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8d82\" (UniqueName: \"kubernetes.io/projected/9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1-kube-api-access-p8d82\") pod \"barbican-api-5956b78c54-6g8cs\" (UID: \"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1\") " pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.159429 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.652105 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.700539 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerID="b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da" exitCode=0 Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.700572 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerID="9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117" exitCode=143 Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.701601 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.702021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3f1bef34-8be8-4329-a61b-9949a5d9493f","Type":"ContainerDied","Data":"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da"} Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.702052 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3f1bef34-8be8-4329-a61b-9949a5d9493f","Type":"ContainerDied","Data":"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117"} Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.702064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3f1bef34-8be8-4329-a61b-9949a5d9493f","Type":"ContainerDied","Data":"93da7f7c96d879904ea56337e54d3627807af29a1cce98f44b34119351efee09"} Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.702082 4835 scope.go:117] "RemoveContainer" containerID="b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.728708 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-scripts\") pod \"3f1bef34-8be8-4329-a61b-9949a5d9493f\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.728775 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-combined-ca-bundle\") pod \"3f1bef34-8be8-4329-a61b-9949a5d9493f\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.728863 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dfm4\" (UniqueName: \"kubernetes.io/projected/3f1bef34-8be8-4329-a61b-9949a5d9493f-kube-api-access-8dfm4\") pod \"3f1bef34-8be8-4329-a61b-9949a5d9493f\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.728900 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data\") pod \"3f1bef34-8be8-4329-a61b-9949a5d9493f\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.728999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f1bef34-8be8-4329-a61b-9949a5d9493f-logs\") pod \"3f1bef34-8be8-4329-a61b-9949a5d9493f\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " 
Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.729039 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f1bef34-8be8-4329-a61b-9949a5d9493f-etc-machine-id\") pod \"3f1bef34-8be8-4329-a61b-9949a5d9493f\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.729201 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data-custom\") pod \"3f1bef34-8be8-4329-a61b-9949a5d9493f\" (UID: \"3f1bef34-8be8-4329-a61b-9949a5d9493f\") " Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.733308 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f1bef34-8be8-4329-a61b-9949a5d9493f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3f1bef34-8be8-4329-a61b-9949a5d9493f" (UID: "3f1bef34-8be8-4329-a61b-9949a5d9493f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.744415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f1bef34-8be8-4329-a61b-9949a5d9493f-logs" (OuterVolumeSpecName: "logs") pod "3f1bef34-8be8-4329-a61b-9949a5d9493f" (UID: "3f1bef34-8be8-4329-a61b-9949a5d9493f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.777453 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f1bef34-8be8-4329-a61b-9949a5d9493f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.777623 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f1bef34-8be8-4329-a61b-9949a5d9493f" (UID: "3f1bef34-8be8-4329-a61b-9949a5d9493f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.793588 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f1bef34-8be8-4329-a61b-9949a5d9493f-kube-api-access-8dfm4" (OuterVolumeSpecName: "kube-api-access-8dfm4") pod "3f1bef34-8be8-4329-a61b-9949a5d9493f" (UID: "3f1bef34-8be8-4329-a61b-9949a5d9493f"). InnerVolumeSpecName "kube-api-access-8dfm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.845532 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-scripts" (OuterVolumeSpecName: "scripts") pod "3f1bef34-8be8-4329-a61b-9949a5d9493f" (UID: "3f1bef34-8be8-4329-a61b-9949a5d9493f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.883683 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.883816 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dfm4\" (UniqueName: \"kubernetes.io/projected/3f1bef34-8be8-4329-a61b-9949a5d9493f-kube-api-access-8dfm4\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.883900 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f1bef34-8be8-4329-a61b-9949a5d9493f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.883973 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.924551 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5956b78c54-6g8cs"] Oct 02 11:15:30 crc kubenswrapper[4835]: I1002 11:15:30.965460 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f1bef34-8be8-4329-a61b-9949a5d9493f" (UID: "3f1bef34-8be8-4329-a61b-9949a5d9493f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.005593 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.022606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data" (OuterVolumeSpecName: "config-data") pod "3f1bef34-8be8-4329-a61b-9949a5d9493f" (UID: "3f1bef34-8be8-4329-a61b-9949a5d9493f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.064627 4835 scope.go:117] "RemoveContainer" containerID="9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.088423 4835 scope.go:117] "RemoveContainer" containerID="b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da" Oct 02 11:15:31 crc kubenswrapper[4835]: E1002 11:15:31.088947 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da\": container with ID starting with b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da not found: ID does not exist" containerID="b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.089012 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da"} err="failed to get container status \"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da\": rpc error: code = NotFound desc = could not find container \"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da\": container with ID starting with b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da not found: ID does not exist" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.089056 4835 scope.go:117] "RemoveContainer" containerID="9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117" Oct 02 11:15:31 crc kubenswrapper[4835]: E1002 11:15:31.089386 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117\": container with ID starting with 9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117 not found: ID does not exist" containerID="9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.089426 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117"} err="failed to get container status \"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117\": rpc error: code = NotFound desc = could not find container \"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117\": container with ID starting with 9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117 not found: ID does not exist" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.089439 4835 scope.go:117] "RemoveContainer" containerID="b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.092973 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da"} err="failed to get container status \"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da\": rpc error: code = NotFound desc = could not find container \"b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da\": container with ID starting with b37a8a0a27e977f7aa1a9370e2443355941b8b6e0936511737f7e4b2bf6f32da not found: ID does not exist" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.093000 4835 
scope.go:117] "RemoveContainer" containerID="9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.093649 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117"} err="failed to get container status \"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117\": rpc error: code = NotFound desc = could not find container \"9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117\": container with ID starting with 9884d7f1a0e1f9f32c84032948a54f757ec16042e82243e702d5a28d2626f117 not found: ID does not exist" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.107596 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1bef34-8be8-4329-a61b-9949a5d9493f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.326335 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8999-account-create-xt68d"] Oct 02 11:15:31 crc kubenswrapper[4835]: E1002 11:15:31.326860 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.326883 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api" Oct 02 11:15:31 crc kubenswrapper[4835]: E1002 11:15:31.326914 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api-log" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.326922 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api-log" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.327126 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.327149 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" containerName="cinder-api-log" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.327916 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8999-account-create-xt68d" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.330294 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.348658 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8999-account-create-xt68d"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.379791 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.388260 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.422579 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.422842 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2sv\" (UniqueName: \"kubernetes.io/projected/806ec124-c88e-471a-891f-a76296deb62a-kube-api-access-4t2sv\") pod \"nova-api-8999-account-create-xt68d\" (UID: \"806ec124-c88e-471a-891f-a76296deb62a\") " pod="openstack/nova-api-8999-account-create-xt68d" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.424444 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.436534 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.436568 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.436545 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.443954 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.512837 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cd62-account-create-jfhjr"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.514046 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cd62-account-create-jfhjr" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.517750 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.524952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42e4bd48-e5db-4d06-947f-63223788352f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525001 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525058 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-config-data-custom\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2sv\" (UniqueName: \"kubernetes.io/projected/806ec124-c88e-471a-891f-a76296deb62a-kube-api-access-4t2sv\") pod \"nova-api-8999-account-create-xt68d\" (UID: \"806ec124-c88e-471a-891f-a76296deb62a\") " pod="openstack/nova-api-8999-account-create-xt68d" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525111 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e4bd48-e5db-4d06-947f-63223788352f-logs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525165 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525188 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-config-data\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525209 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv9g6\" (UniqueName: \"kubernetes.io/projected/42e4bd48-e5db-4d06-947f-63223788352f-kube-api-access-hv9g6\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-scripts\") pod 
\"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.525277 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.526954 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cd62-account-create-jfhjr"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.554883 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2sv\" (UniqueName: \"kubernetes.io/projected/806ec124-c88e-471a-891f-a76296deb62a-kube-api-access-4t2sv\") pod \"nova-api-8999-account-create-xt68d\" (UID: \"806ec124-c88e-471a-891f-a76296deb62a\") " pod="openstack/nova-api-8999-account-create-xt68d" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.626943 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-config-data-custom\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627025 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e4bd48-e5db-4d06-947f-63223788352f-logs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627122 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-config-data\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627211 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv9g6\" (UniqueName: \"kubernetes.io/projected/42e4bd48-e5db-4d06-947f-63223788352f-kube-api-access-hv9g6\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627287 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-scripts\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc 
kubenswrapper[4835]: I1002 11:15:31.627373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42e4bd48-e5db-4d06-947f-63223788352f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627425 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627485 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbvcp\" (UniqueName: \"kubernetes.io/projected/7ca21d32-62e0-4438-85cc-5a60c3933915-kube-api-access-bbvcp\") pod \"nova-cell0-cd62-account-create-jfhjr\" (UID: \"7ca21d32-62e0-4438-85cc-5a60c3933915\") " pod="openstack/nova-cell0-cd62-account-create-jfhjr" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.627572 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42e4bd48-e5db-4d06-947f-63223788352f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.628181 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e4bd48-e5db-4d06-947f-63223788352f-logs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.633417 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-config-data\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.633442 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-scripts\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.634576 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-config-data-custom\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.634836 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.635878 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 
11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.637750 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e4bd48-e5db-4d06-947f-63223788352f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.646868 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv9g6\" (UniqueName: \"kubernetes.io/projected/42e4bd48-e5db-4d06-947f-63223788352f-kube-api-access-hv9g6\") pod \"cinder-api-0\" (UID: \"42e4bd48-e5db-4d06-947f-63223788352f\") " pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.660879 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8999-account-create-xt68d" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.705805 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7cd3-account-create-bkdqw"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.711558 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.718670 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.726745 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7cd3-account-create-bkdqw"] Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.732806 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5956b78c54-6g8cs" event={"ID":"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1","Type":"ContainerStarted","Data":"c5c572413a62b29f4badf64c3ae6eaf033514a5cb78f8ba9f559becd18cdd981"} Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.732850 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5956b78c54-6g8cs" event={"ID":"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1","Type":"ContainerStarted","Data":"e04907794d90f6992cbe80466eb76bd0c1404811e1abef4072417600a1f1617f"} Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.732861 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5956b78c54-6g8cs" event={"ID":"9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1","Type":"ContainerStarted","Data":"0557d8f2d667bb3906e11496c3463ebd2a5542b38dd45e8b94d405c393a4ab2f"} Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.733330 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.733394 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.733943 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2d8r\" (UniqueName: \"kubernetes.io/projected/9de48fc0-9dd0-4a25-894e-e27bca99f97f-kube-api-access-q2d8r\") pod \"nova-cell1-7cd3-account-create-bkdqw\" (UID: \"9de48fc0-9dd0-4a25-894e-e27bca99f97f\") " pod="openstack/nova-cell1-7cd3-account-create-bkdqw" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.734479 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvcp\" (UniqueName: 
\"kubernetes.io/projected/7ca21d32-62e0-4438-85cc-5a60c3933915-kube-api-access-bbvcp\") pod \"nova-cell0-cd62-account-create-jfhjr\" (UID: \"7ca21d32-62e0-4438-85cc-5a60c3933915\") " pod="openstack/nova-cell0-cd62-account-create-jfhjr" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.758016 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvcp\" (UniqueName: \"kubernetes.io/projected/7ca21d32-62e0-4438-85cc-5a60c3933915-kube-api-access-bbvcp\") pod \"nova-cell0-cd62-account-create-jfhjr\" (UID: \"7ca21d32-62e0-4438-85cc-5a60c3933915\") " pod="openstack/nova-cell0-cd62-account-create-jfhjr" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.775646 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.841150 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2d8r\" (UniqueName: \"kubernetes.io/projected/9de48fc0-9dd0-4a25-894e-e27bca99f97f-kube-api-access-q2d8r\") pod \"nova-cell1-7cd3-account-create-bkdqw\" (UID: \"9de48fc0-9dd0-4a25-894e-e27bca99f97f\") " pod="openstack/nova-cell1-7cd3-account-create-bkdqw" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.843468 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cd62-account-create-jfhjr" Oct 02 11:15:31 crc kubenswrapper[4835]: I1002 11:15:31.886577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2d8r\" (UniqueName: \"kubernetes.io/projected/9de48fc0-9dd0-4a25-894e-e27bca99f97f-kube-api-access-q2d8r\") pod \"nova-cell1-7cd3-account-create-bkdqw\" (UID: \"9de48fc0-9dd0-4a25-894e-e27bca99f97f\") " pod="openstack/nova-cell1-7cd3-account-create-bkdqw" Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.064612 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.218083 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5956b78c54-6g8cs" podStartSLOduration=3.2180584899999998 podStartE2EDuration="3.21805849s" podCreationTimestamp="2025-10-02 11:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:31.8016956 +0000 UTC m=+1208.361603181" watchObservedRunningTime="2025-10-02 11:15:32.21805849 +0000 UTC m=+1208.777966061" Oct 02 11:15:32 crc kubenswrapper[4835]: W1002 11:15:32.234001 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod806ec124_c88e_471a_891f_a76296deb62a.slice/crio-747a0f2c5ec3ffec3268489d4fd87e5c865377039415205e5b0ccc03d4bd145d WatchSource:0}: Error finding container 747a0f2c5ec3ffec3268489d4fd87e5c865377039415205e5b0ccc03d4bd145d: Status 404 returned error can't find the container with id 747a0f2c5ec3ffec3268489d4fd87e5c865377039415205e5b0ccc03d4bd145d Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.239358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8999-account-create-xt68d"] Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.288961 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f1bef34-8be8-4329-a61b-9949a5d9493f" path="/var/lib/kubelet/pods/3f1bef34-8be8-4329-a61b-9949a5d9493f/volumes" Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.492294 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cd62-account-create-jfhjr"] Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.499936 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.752264 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cd62-account-create-jfhjr" event={"ID":"7ca21d32-62e0-4438-85cc-5a60c3933915","Type":"ContainerStarted","Data":"eccc800bf11f0fc18a00e83c0427fc5e96d1571df313495648159bae7db4c0e0"} Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.754340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8999-account-create-xt68d" event={"ID":"806ec124-c88e-471a-891f-a76296deb62a","Type":"ContainerStarted","Data":"da9588ed54e9c84dc41cd58113b37f1f640de3761ce9666c815c4c3fade6d142"} Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.754370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8999-account-create-xt68d" event={"ID":"806ec124-c88e-471a-891f-a76296deb62a","Type":"ContainerStarted","Data":"747a0f2c5ec3ffec3268489d4fd87e5c865377039415205e5b0ccc03d4bd145d"} Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.758505 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"42e4bd48-e5db-4d06-947f-63223788352f","Type":"ContainerStarted","Data":"128fb60229d804d2998b844661ddf94698252790a70ab8d8c4cbd4bc5874dae1"} Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.780408 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8999-account-create-xt68d" podStartSLOduration=1.780385376 podStartE2EDuration="1.780385376s" podCreationTimestamp="2025-10-02 11:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:32.771511902 +0000 UTC m=+1209.331419483" watchObservedRunningTime="2025-10-02 11:15:32.780385376 +0000 UTC m=+1209.340292957" Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.806241 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7cd3-account-create-bkdqw"] Oct 02 11:15:32 crc kubenswrapper[4835]: W1002 11:15:32.807465 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9de48fc0_9dd0_4a25_894e_e27bca99f97f.slice/crio-b4271be466b329929b1398180d01ad3dda229851943724bdb43642c595aff299 WatchSource:0}: Error finding container b4271be466b329929b1398180d01ad3dda229851943724bdb43642c595aff299: Status 404 returned error can't find the container with id b4271be466b329929b1398180d01ad3dda229851943724bdb43642c595aff299 Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.992690 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.993067 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-central-agent" containerID="cri-o://ebb345f1b91ce7c17307535e83371a5ffa342c531128485a73f1f6ab04b1c01c" gracePeriod=30 Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.993694 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="proxy-httpd" containerID="cri-o://796f67033b0c50666606049060c308e2500420b4c29cc42a982cabf8c25466ac" gracePeriod=30 Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.993778 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="sg-core" containerID="cri-o://a7a69a8916056cf4990fd1f397df905fafa698d5942683a1009862bcc93a5e45" gracePeriod=30 Oct 02 11:15:32 crc kubenswrapper[4835]: I1002 11:15:32.993842 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-notification-agent" containerID="cri-o://d1a88fa97c5d0a6cb7a9b0313a67615f960cdba5f2c29016bae59a20fc2408e4" gracePeriod=30 Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.037012 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.037243 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3de79574-cfeb-4000-881a-94f1e4e22235" containerName="kube-state-metrics" containerID="cri-o://1a1bdf6709c45dfa9e7349df3425d989560ad4c89e21fb01b0fa0de223be33a8" gracePeriod=30 Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.783479 4835 generic.go:334] "Generic (PLEG): container finished" podID="7ca21d32-62e0-4438-85cc-5a60c3933915" containerID="09ab146fade6566a9fb02d4c99966c2be19fec660d446ec9fd8cec25b4263652" exitCode=0 Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.784108 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cd62-account-create-jfhjr" 
event={"ID":"7ca21d32-62e0-4438-85cc-5a60c3933915","Type":"ContainerDied","Data":"09ab146fade6566a9fb02d4c99966c2be19fec660d446ec9fd8cec25b4263652"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.795797 4835 generic.go:334] "Generic (PLEG): container finished" podID="3de79574-cfeb-4000-881a-94f1e4e22235" containerID="1a1bdf6709c45dfa9e7349df3425d989560ad4c89e21fb01b0fa0de223be33a8" exitCode=2 Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.795873 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3de79574-cfeb-4000-881a-94f1e4e22235","Type":"ContainerDied","Data":"1a1bdf6709c45dfa9e7349df3425d989560ad4c89e21fb01b0fa0de223be33a8"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.824463 4835 generic.go:334] "Generic (PLEG): container finished" podID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerID="796f67033b0c50666606049060c308e2500420b4c29cc42a982cabf8c25466ac" exitCode=0 Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.824529 4835 generic.go:334] "Generic (PLEG): container finished" podID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerID="a7a69a8916056cf4990fd1f397df905fafa698d5942683a1009862bcc93a5e45" exitCode=2 Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.824645 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerDied","Data":"796f67033b0c50666606049060c308e2500420b4c29cc42a982cabf8c25466ac"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.824702 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerDied","Data":"a7a69a8916056cf4990fd1f397df905fafa698d5942683a1009862bcc93a5e45"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.826743 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" event={"ID":"9de48fc0-9dd0-4a25-894e-e27bca99f97f","Type":"ContainerStarted","Data":"0e5d8f9241c40fb080ed6dcd04f01be52e988873cc2fd40b04c9695a727fc99f"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.826790 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" event={"ID":"9de48fc0-9dd0-4a25-894e-e27bca99f97f","Type":"ContainerStarted","Data":"b4271be466b329929b1398180d01ad3dda229851943724bdb43642c595aff299"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.828638 4835 generic.go:334] "Generic (PLEG): container finished" podID="806ec124-c88e-471a-891f-a76296deb62a" containerID="da9588ed54e9c84dc41cd58113b37f1f640de3761ce9666c815c4c3fade6d142" exitCode=0 Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.828699 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8999-account-create-xt68d" event={"ID":"806ec124-c88e-471a-891f-a76296deb62a","Type":"ContainerDied","Data":"da9588ed54e9c84dc41cd58113b37f1f640de3761ce9666c815c4c3fade6d142"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.829705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"42e4bd48-e5db-4d06-947f-63223788352f","Type":"ContainerStarted","Data":"68d4c2902bca0808c248087ce6db3485deef16009e393f9064421302244e7706"} Oct 02 11:15:33 crc kubenswrapper[4835]: I1002 11:15:33.901573 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 
11:15:34.113652 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.232559 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxjzt\" (UniqueName: \"kubernetes.io/projected/3de79574-cfeb-4000-881a-94f1e4e22235-kube-api-access-qxjzt\") pod \"3de79574-cfeb-4000-881a-94f1e4e22235\" (UID: \"3de79574-cfeb-4000-881a-94f1e4e22235\") " Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.261305 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de79574-cfeb-4000-881a-94f1e4e22235-kube-api-access-qxjzt" (OuterVolumeSpecName: "kube-api-access-qxjzt") pod "3de79574-cfeb-4000-881a-94f1e4e22235" (UID: "3de79574-cfeb-4000-881a-94f1e4e22235"). InnerVolumeSpecName "kube-api-access-qxjzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.275832 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.345431 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxjzt\" (UniqueName: \"kubernetes.io/projected/3de79574-cfeb-4000-881a-94f1e4e22235-kube-api-access-qxjzt\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.388370 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-d4ngg"] Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.388622 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" podUID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerName="dnsmasq-dns" containerID="cri-o://9c33867b6750f2b2be57b205a9e5bb3c1480513fa54d9f30c60fe290d1b06144" gracePeriod=10 Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.581210 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.867036 4835 generic.go:334] "Generic (PLEG): container finished" podID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerID="ebb345f1b91ce7c17307535e83371a5ffa342c531128485a73f1f6ab04b1c01c" exitCode=0 Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.867542 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerDied","Data":"ebb345f1b91ce7c17307535e83371a5ffa342c531128485a73f1f6ab04b1c01c"} Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.869068 4835 generic.go:334] "Generic (PLEG): container finished" podID="9de48fc0-9dd0-4a25-894e-e27bca99f97f" containerID="0e5d8f9241c40fb080ed6dcd04f01be52e988873cc2fd40b04c9695a727fc99f" exitCode=0 Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.869117 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" event={"ID":"9de48fc0-9dd0-4a25-894e-e27bca99f97f","Type":"ContainerDied","Data":"0e5d8f9241c40fb080ed6dcd04f01be52e988873cc2fd40b04c9695a727fc99f"} Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.895877 4835 generic.go:334] "Generic (PLEG): container finished" podID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerID="9c33867b6750f2b2be57b205a9e5bb3c1480513fa54d9f30c60fe290d1b06144" exitCode=0 Oct 02 11:15:34 crc kubenswrapper[4835]: 
I1002 11:15:34.895999 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" event={"ID":"d8a71c32-05c6-4635-9e6a-3d72d59edd72","Type":"ContainerDied","Data":"9c33867b6750f2b2be57b205a9e5bb3c1480513fa54d9f30c60fe290d1b06144"} Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.900203 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.900702 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3de79574-cfeb-4000-881a-94f1e4e22235","Type":"ContainerDied","Data":"24e334d32b113214d8521f84fdc0e912672b542505dfc4bc49f671fced98566c"} Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.900784 4835 scope.go:117] "RemoveContainer" containerID="1a1bdf6709c45dfa9e7349df3425d989560ad4c89e21fb01b0fa0de223be33a8" Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.952743 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:34 crc kubenswrapper[4835]: I1002 11:15:34.968389 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.188248 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.212042 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:15:35 crc kubenswrapper[4835]: E1002 11:15:35.213936 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de79574-cfeb-4000-881a-94f1e4e22235" containerName="kube-state-metrics" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.213971 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de79574-cfeb-4000-881a-94f1e4e22235" containerName="kube-state-metrics" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.214603 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de79574-cfeb-4000-881a-94f1e4e22235" containerName="kube-state-metrics" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.215824 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.219520 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.228554 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.230544 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.291498 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.385411 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.385496 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.385571 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74ntr\" (UniqueName: \"kubernetes.io/projected/7deec852-7067-4dfe-b052-3e385e350a93-kube-api-access-74ntr\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.385596 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.489716 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-sb\") pod \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.489817 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-config\") pod \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.489944 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgbgs\" (UniqueName: \"kubernetes.io/projected/d8a71c32-05c6-4635-9e6a-3d72d59edd72-kube-api-access-dgbgs\") pod \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.489973 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-nb\") pod \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.490041 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-dns-svc\") pod \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\" (UID: \"d8a71c32-05c6-4635-9e6a-3d72d59edd72\") " Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.490514 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.490583 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.490689 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74ntr\" (UniqueName: \"kubernetes.io/projected/7deec852-7067-4dfe-b052-3e385e350a93-kube-api-access-74ntr\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.490750 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.502035 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.519057 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.523965 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a71c32-05c6-4635-9e6a-3d72d59edd72-kube-api-access-dgbgs" (OuterVolumeSpecName: "kube-api-access-dgbgs") pod "d8a71c32-05c6-4635-9e6a-3d72d59edd72" (UID: "d8a71c32-05c6-4635-9e6a-3d72d59edd72"). InnerVolumeSpecName "kube-api-access-dgbgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.540853 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7deec852-7067-4dfe-b052-3e385e350a93-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.549072 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74ntr\" (UniqueName: \"kubernetes.io/projected/7deec852-7067-4dfe-b052-3e385e350a93-kube-api-access-74ntr\") pod \"kube-state-metrics-0\" (UID: \"7deec852-7067-4dfe-b052-3e385e350a93\") " pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.576269 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.597162 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.602263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2d8r\" (UniqueName: \"kubernetes.io/projected/9de48fc0-9dd0-4a25-894e-e27bca99f97f-kube-api-access-q2d8r\") pod \"9de48fc0-9dd0-4a25-894e-e27bca99f97f\" (UID: \"9de48fc0-9dd0-4a25-894e-e27bca99f97f\") " Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.602809 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgbgs\" (UniqueName: \"kubernetes.io/projected/d8a71c32-05c6-4635-9e6a-3d72d59edd72-kube-api-access-dgbgs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.629447 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de48fc0-9dd0-4a25-894e-e27bca99f97f-kube-api-access-q2d8r" (OuterVolumeSpecName: "kube-api-access-q2d8r") pod "9de48fc0-9dd0-4a25-894e-e27bca99f97f" (UID: "9de48fc0-9dd0-4a25-894e-e27bca99f97f"). InnerVolumeSpecName "kube-api-access-q2d8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.630653 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8a71c32-05c6-4635-9e6a-3d72d59edd72" (UID: "d8a71c32-05c6-4635-9e6a-3d72d59edd72"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.661089 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8999-account-create-xt68d" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.677083 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8a71c32-05c6-4635-9e6a-3d72d59edd72" (UID: "d8a71c32-05c6-4635-9e6a-3d72d59edd72"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.708471 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-config" (OuterVolumeSpecName: "config") pod "d8a71c32-05c6-4635-9e6a-3d72d59edd72" (UID: "d8a71c32-05c6-4635-9e6a-3d72d59edd72"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.710333 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.710367 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2d8r\" (UniqueName: \"kubernetes.io/projected/9de48fc0-9dd0-4a25-894e-e27bca99f97f-kube-api-access-q2d8r\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.710380 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.710390 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.759476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8a71c32-05c6-4635-9e6a-3d72d59edd72" (UID: "d8a71c32-05c6-4635-9e6a-3d72d59edd72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.820721 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2sv\" (UniqueName: \"kubernetes.io/projected/806ec124-c88e-471a-891f-a76296deb62a-kube-api-access-4t2sv\") pod \"806ec124-c88e-471a-891f-a76296deb62a\" (UID: \"806ec124-c88e-471a-891f-a76296deb62a\") " Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.821117 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a71c32-05c6-4635-9e6a-3d72d59edd72-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.827059 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806ec124-c88e-471a-891f-a76296deb62a-kube-api-access-4t2sv" (OuterVolumeSpecName: "kube-api-access-4t2sv") pod "806ec124-c88e-471a-891f-a76296deb62a" (UID: "806ec124-c88e-471a-891f-a76296deb62a"). InnerVolumeSpecName "kube-api-access-4t2sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.927034 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t2sv\" (UniqueName: \"kubernetes.io/projected/806ec124-c88e-471a-891f-a76296deb62a-kube-api-access-4t2sv\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.943504 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" event={"ID":"9de48fc0-9dd0-4a25-894e-e27bca99f97f","Type":"ContainerDied","Data":"b4271be466b329929b1398180d01ad3dda229851943724bdb43642c595aff299"} Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.944020 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4271be466b329929b1398180d01ad3dda229851943724bdb43642c595aff299" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.944111 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7cd3-account-create-bkdqw" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.958484 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8999-account-create-xt68d" event={"ID":"806ec124-c88e-471a-891f-a76296deb62a","Type":"ContainerDied","Data":"747a0f2c5ec3ffec3268489d4fd87e5c865377039415205e5b0ccc03d4bd145d"} Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.958534 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="747a0f2c5ec3ffec3268489d4fd87e5c865377039415205e5b0ccc03d4bd145d" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.958598 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8999-account-create-xt68d" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.973596 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"42e4bd48-e5db-4d06-947f-63223788352f","Type":"ContainerStarted","Data":"ae611c0063e43dcfc7b557fd4ecbfda689b8b8da1afba23e22bd2683c8ef5966"} Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.975212 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.987940 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" event={"ID":"d8a71c32-05c6-4635-9e6a-3d72d59edd72","Type":"ContainerDied","Data":"1ce99fd53c32b01e37ee8998f4817e43e69b4c90bd979e6323ac6ce1f20cdec4"} Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.988005 4835 scope.go:117] "RemoveContainer" containerID="9c33867b6750f2b2be57b205a9e5bb3c1480513fa54d9f30c60fe290d1b06144" Oct 02 11:15:35 crc kubenswrapper[4835]: I1002 11:15:35.988166 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-d4ngg" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.041894 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.041877136 podStartE2EDuration="5.041877136s" podCreationTimestamp="2025-10-02 11:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:36.018668221 +0000 UTC m=+1212.578575802" watchObservedRunningTime="2025-10-02 11:15:36.041877136 +0000 UTC m=+1212.601784717" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.100795 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="cinder-scheduler" containerID="cri-o://4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8" gracePeriod=30 Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.100939 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cd62-account-create-jfhjr" event={"ID":"7ca21d32-62e0-4438-85cc-5a60c3933915","Type":"ContainerDied","Data":"eccc800bf11f0fc18a00e83c0427fc5e96d1571df313495648159bae7db4c0e0"} Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.100969 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eccc800bf11f0fc18a00e83c0427fc5e96d1571df313495648159bae7db4c0e0" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.101254 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="probe" containerID="cri-o://dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847" gracePeriod=30 Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.240953 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:15:36 crc kubenswrapper[4835]: W1002 11:15:36.323045 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7deec852_7067_4dfe_b052_3e385e350a93.slice/crio-f57417cda228c7ae188ba734ea7a8c598d0810793ce9d2195e2ab01ac8c0b12a WatchSource:0}: Error finding container f57417cda228c7ae188ba734ea7a8c598d0810793ce9d2195e2ab01ac8c0b12a: Status 404 returned error can't find the container with id f57417cda228c7ae188ba734ea7a8c598d0810793ce9d2195e2ab01ac8c0b12a Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.329857 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de79574-cfeb-4000-881a-94f1e4e22235" path="/var/lib/kubelet/pods/3de79574-cfeb-4000-881a-94f1e4e22235/volumes" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.373513 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cd62-account-create-jfhjr" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.396641 4835 scope.go:117] "RemoveContainer" containerID="935cd6c7e6e10a88cf01b25936f0cd81cc4a73e79043dc360e73b31005df8822" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.448815 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-d4ngg"] Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.460035 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-d4ngg"] Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.541347 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbvcp\" (UniqueName: \"kubernetes.io/projected/7ca21d32-62e0-4438-85cc-5a60c3933915-kube-api-access-bbvcp\") pod \"7ca21d32-62e0-4438-85cc-5a60c3933915\" (UID: \"7ca21d32-62e0-4438-85cc-5a60c3933915\") " Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.547882 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca21d32-62e0-4438-85cc-5a60c3933915-kube-api-access-bbvcp" (OuterVolumeSpecName: "kube-api-access-bbvcp") pod "7ca21d32-62e0-4438-85cc-5a60c3933915" (UID: "7ca21d32-62e0-4438-85cc-5a60c3933915"). InnerVolumeSpecName "kube-api-access-bbvcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.620629 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.643761 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbvcp\" (UniqueName: \"kubernetes.io/projected/7ca21d32-62e0-4438-85cc-5a60c3933915-kube-api-access-bbvcp\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:36 crc kubenswrapper[4835]: I1002 11:15:36.785768 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:37 crc kubenswrapper[4835]: I1002 11:15:37.121195 4835 generic.go:334] "Generic (PLEG): container finished" podID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerID="d1a88fa97c5d0a6cb7a9b0313a67615f960cdba5f2c29016bae59a20fc2408e4" exitCode=0 Oct 02 11:15:37 crc kubenswrapper[4835]: I1002 11:15:37.121280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerDied","Data":"d1a88fa97c5d0a6cb7a9b0313a67615f960cdba5f2c29016bae59a20fc2408e4"} Oct 02 11:15:37 crc kubenswrapper[4835]: I1002 11:15:37.123585 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7deec852-7067-4dfe-b052-3e385e350a93","Type":"ContainerStarted","Data":"f57417cda228c7ae188ba734ea7a8c598d0810793ce9d2195e2ab01ac8c0b12a"} Oct 02 11:15:37 crc kubenswrapper[4835]: I1002 11:15:37.129956 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cd62-account-create-jfhjr" Oct 02 11:15:37 crc kubenswrapper[4835]: I1002 11:15:37.934844 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.076542 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-log-httpd\") pod \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.076604 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-config-data\") pod \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.076629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-sg-core-conf-yaml\") pod \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.076686 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-scripts\") pod \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.076715 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-run-httpd\") pod \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.076744 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j2hx\" (UniqueName: \"kubernetes.io/projected/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-kube-api-access-9j2hx\") pod \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.076840 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-combined-ca-bundle\") pod \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\" (UID: \"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7\") " Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.077470 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" (UID: "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.077553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" (UID: "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.083565 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-scripts" (OuterVolumeSpecName: "scripts") pod "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" (UID: "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.108632 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-kube-api-access-9j2hx" (OuterVolumeSpecName: "kube-api-access-9j2hx") pod "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" (UID: "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7"). InnerVolumeSpecName "kube-api-access-9j2hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.159324 4835 generic.go:334] "Generic (PLEG): container finished" podID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerID="dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847" exitCode=0 Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.159426 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0","Type":"ContainerDied","Data":"dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847"} Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.167616 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50af4b0c-a462-4d15-ba2b-8ce07e9db0b7","Type":"ContainerDied","Data":"2497d86137fbad3d073a9f7a2e30bcc7181c495342e6bfc8f7e706be609cdf22"} Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.167693 4835 scope.go:117] "RemoveContainer" containerID="796f67033b0c50666606049060c308e2500420b4c29cc42a982cabf8c25466ac" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.167882 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.184169 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.186012 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.186106 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.186179 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j2hx\" (UniqueName: \"kubernetes.io/projected/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-kube-api-access-9j2hx\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.184934 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7deec852-7067-4dfe-b052-3e385e350a93","Type":"ContainerStarted","Data":"ed87f95091388afe309fbfd6d7b6f82abd8dcda588e2f87e29990d25dc7415e2"} Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.186382 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.194616 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" (UID: "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.211408 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.302718583 podStartE2EDuration="4.211381721s" podCreationTimestamp="2025-10-02 11:15:34 +0000 UTC" firstStartedPulling="2025-10-02 11:15:36.329041931 +0000 UTC m=+1212.888949512" lastFinishedPulling="2025-10-02 11:15:37.237705069 +0000 UTC m=+1213.797612650" observedRunningTime="2025-10-02 11:15:38.206415869 +0000 UTC m=+1214.766323450" watchObservedRunningTime="2025-10-02 11:15:38.211381721 +0000 UTC m=+1214.771289302" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.212468 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" (UID: "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.229349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-config-data" (OuterVolumeSpecName: "config-data") pod "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" (UID: "50af4b0c-a462-4d15-ba2b-8ce07e9db0b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.264541 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" path="/var/lib/kubelet/pods/d8a71c32-05c6-4635-9e6a-3d72d59edd72/volumes" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.288375 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.288417 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.288430 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.304977 4835 scope.go:117] "RemoveContainer" containerID="a7a69a8916056cf4990fd1f397df905fafa698d5942683a1009862bcc93a5e45" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.330753 4835 scope.go:117] "RemoveContainer" containerID="d1a88fa97c5d0a6cb7a9b0313a67615f960cdba5f2c29016bae59a20fc2408e4" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.358404 4835 scope.go:117] "RemoveContainer" containerID="ebb345f1b91ce7c17307535e83371a5ffa342c531128485a73f1f6ab04b1c01c" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.491714 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.506729 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535137 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535656 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806ec124-c88e-471a-891f-a76296deb62a" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535689 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="806ec124-c88e-471a-891f-a76296deb62a" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535712 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="sg-core" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535721 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="sg-core" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535738 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-central-agent" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535747 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-central-agent" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535786 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de48fc0-9dd0-4a25-894e-e27bca99f97f" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 
11:15:38.535797 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de48fc0-9dd0-4a25-894e-e27bca99f97f" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535809 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="proxy-httpd" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535820 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="proxy-httpd" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535843 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerName="dnsmasq-dns" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535855 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerName="dnsmasq-dns" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535870 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-notification-agent" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535883 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-notification-agent" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535924 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca21d32-62e0-4438-85cc-5a60c3933915" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535934 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca21d32-62e0-4438-85cc-5a60c3933915" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: E1002 11:15:38.535961 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerName="init" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.535970 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerName="init" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536191 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="806ec124-c88e-471a-891f-a76296deb62a" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536239 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="proxy-httpd" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536262 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca21d32-62e0-4438-85cc-5a60c3933915" containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536277 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="sg-core" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536294 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-central-agent" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536309 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" containerName="ceilometer-notification-agent" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536321 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de48fc0-9dd0-4a25-894e-e27bca99f97f" 
containerName="mariadb-account-create" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.536334 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a71c32-05c6-4635-9e6a-3d72d59edd72" containerName="dnsmasq-dns" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.538484 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.541436 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.541719 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.544611 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.566335 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695048 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-scripts\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695290 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-log-httpd\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695327 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695490 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmx8m\" (UniqueName: \"kubernetes.io/projected/14ec883d-a110-4e19-ba6d-7435b5e76cbc-kube-api-access-pmx8m\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-run-httpd\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695665 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.695800 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-config-data\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmx8m\" (UniqueName: \"kubernetes.io/projected/14ec883d-a110-4e19-ba6d-7435b5e76cbc-kube-api-access-pmx8m\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-run-httpd\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797254 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797285 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-config-data\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797350 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-scripts\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797386 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-log-httpd\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797404 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.797443 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.798515 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-log-httpd\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.798523 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-run-httpd\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.802294 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.804582 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.804944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.804940 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-scripts\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.805632 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-config-data\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.817802 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmx8m\" (UniqueName: \"kubernetes.io/projected/14ec883d-a110-4e19-ba6d-7435b5e76cbc-kube-api-access-pmx8m\") pod \"ceilometer-0\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " pod="openstack/ceilometer-0" Oct 02 11:15:38 crc kubenswrapper[4835]: I1002 11:15:38.870024 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:39 crc kubenswrapper[4835]: I1002 11:15:39.447490 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:39 crc kubenswrapper[4835]: W1002 11:15:39.496167 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ec883d_a110_4e19_ba6d_7435b5e76cbc.slice/crio-f0e0776bade9db3fb3ff46477f6c9f9d0dad5712a232d70b03be3a34a576e2e2 WatchSource:0}: Error finding container f0e0776bade9db3fb3ff46477f6c9f9d0dad5712a232d70b03be3a34a576e2e2: Status 404 returned error can't find the container with id f0e0776bade9db3fb3ff46477f6c9f9d0dad5712a232d70b03be3a34a576e2e2 Oct 02 11:15:40 crc kubenswrapper[4835]: I1002 11:15:40.241750 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerStarted","Data":"f0e0776bade9db3fb3ff46477f6c9f9d0dad5712a232d70b03be3a34a576e2e2"} Oct 02 11:15:40 crc kubenswrapper[4835]: I1002 11:15:40.268558 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50af4b0c-a462-4d15-ba2b-8ce07e9db0b7" path="/var/lib/kubelet/pods/50af4b0c-a462-4d15-ba2b-8ce07e9db0b7/volumes" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.193173 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.278196 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-etc-machine-id\") pod \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.278334 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data\") pod \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.278437 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-combined-ca-bundle\") pod \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.278516 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data-custom\") pod \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.278600 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9v7w\" (UniqueName: \"kubernetes.io/projected/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-kube-api-access-g9v7w\") pod \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\" (UID: \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.278648 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-scripts\") pod \"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\" (UID: 
\"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0\") " Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.279465 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" (UID: "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.287510 4835 generic.go:334] "Generic (PLEG): container finished" podID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerID="4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8" exitCode=0 Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.287584 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0","Type":"ContainerDied","Data":"4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8"} Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.287615 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8b18e172-ad4a-45e4-9a5b-40e89d8f48d0","Type":"ContainerDied","Data":"697160650c044cad114ac9e748180dc38410f1ee8be2fd988374974b05884166"} Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.287633 4835 scope.go:117] "RemoveContainer" containerID="dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.287779 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.309101 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-scripts" (OuterVolumeSpecName: "scripts") pod "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" (UID: "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.310327 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" (UID: "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.320098 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-kube-api-access-g9v7w" (OuterVolumeSpecName: "kube-api-access-g9v7w") pod "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" (UID: "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0"). InnerVolumeSpecName "kube-api-access-g9v7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.320922 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerStarted","Data":"f7a09d1b2320770b3ca6bb8c61bfa4ad16222f35ccb9ee77db469b6f6d0a6f12"} Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.382916 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.382978 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9v7w\" (UniqueName: \"kubernetes.io/projected/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-kube-api-access-g9v7w\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.382995 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.383010 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.393708 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" (UID: "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.407630 4835 scope.go:117] "RemoveContainer" containerID="4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.432117 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data" (OuterVolumeSpecName: "config-data") pod "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" (UID: "8b18e172-ad4a-45e4-9a5b-40e89d8f48d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.450940 4835 scope.go:117] "RemoveContainer" containerID="dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847" Oct 02 11:15:41 crc kubenswrapper[4835]: E1002 11:15:41.454551 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847\": container with ID starting with dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847 not found: ID does not exist" containerID="dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.454622 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847"} err="failed to get container status \"dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847\": rpc error: code = NotFound desc = could not find container \"dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847\": container with ID starting with dbec99e13de68072cedd367356f7a8b23f5bde9aedcfbd4bd885f6b302a57847 not found: ID does not exist" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.454655 4835 scope.go:117] "RemoveContainer" containerID="4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8" Oct 02 11:15:41 crc kubenswrapper[4835]: E1002 11:15:41.455447 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8\": container with ID starting with 4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8 not found: ID does not exist" containerID="4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.455466 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8"} err="failed to get container status \"4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8\": rpc error: code = NotFound desc = could not find container \"4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8\": container with ID starting with 4d4b0d3e1733053f6a5bf332432823c71080d0f87524b9a354715c23d3f9e6f8 not found: ID does not exist" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.484587 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.484650 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.630509 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.670563 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.680470 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:41 crc 
kubenswrapper[4835]: E1002 11:15:41.681070 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="cinder-scheduler" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.681092 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="cinder-scheduler" Oct 02 11:15:41 crc kubenswrapper[4835]: E1002 11:15:41.681114 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="probe" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.681123 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="probe" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.681366 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="probe" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.681395 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" containerName="cinder-scheduler" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.683882 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.687462 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.713509 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.792030 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.792092 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mhh9\" (UniqueName: \"kubernetes.io/projected/aaa8b6c2-6884-4f83-bb0b-a866426ca426-kube-api-access-9mhh9\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.792208 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-config-data\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.792295 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-scripts\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.792545 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") 
" pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.792588 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aaa8b6c2-6884-4f83-bb0b-a866426ca426-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.814157 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qhlf"] Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.819921 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.824305 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ltbl8" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.825821 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.828444 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qhlf"] Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.829805 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895508 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-config-data\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895574 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fst\" (UniqueName: \"kubernetes.io/projected/ff2c9e49-6700-488d-bbcd-46812b7bf134-kube-api-access-74fst\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895638 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-config-data\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895658 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-scripts\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895680 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-scripts\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895749 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895807 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aaa8b6c2-6884-4f83-bb0b-a866426ca426-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895949 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.895976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mhh9\" (UniqueName: \"kubernetes.io/projected/aaa8b6c2-6884-4f83-bb0b-a866426ca426-kube-api-access-9mhh9\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.897536 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aaa8b6c2-6884-4f83-bb0b-a866426ca426-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.908336 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.911420 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-scripts\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.912254 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-config-data\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.916856 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8b6c2-6884-4f83-bb0b-a866426ca426-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:41 crc kubenswrapper[4835]: I1002 11:15:41.938584 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mhh9\" (UniqueName: \"kubernetes.io/projected/aaa8b6c2-6884-4f83-bb0b-a866426ca426-kube-api-access-9mhh9\") pod \"cinder-scheduler-0\" (UID: \"aaa8b6c2-6884-4f83-bb0b-a866426ca426\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.000994 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74fst\" (UniqueName: \"kubernetes.io/projected/ff2c9e49-6700-488d-bbcd-46812b7bf134-kube-api-access-74fst\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.001087 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-config-data\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.001143 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-scripts\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.001206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.003373 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.003474 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.009411 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.009827 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-config-data\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.013772 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-scripts\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.040626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fst\" (UniqueName: \"kubernetes.io/projected/ff2c9e49-6700-488d-bbcd-46812b7bf134-kube-api-access-74fst\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.055451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7qhlf\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.174415 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.264819 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b18e172-ad4a-45e4-9a5b-40e89d8f48d0" path="/var/lib/kubelet/pods/8b18e172-ad4a-45e4-9a5b-40e89d8f48d0/volumes" Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.668122 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.900918 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qhlf"] Oct 02 11:15:42 crc kubenswrapper[4835]: I1002 11:15:42.972698 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:43 crc kubenswrapper[4835]: I1002 11:15:43.224692 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5956b78c54-6g8cs" Oct 02 11:15:43 crc kubenswrapper[4835]: I1002 11:15:43.325417 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b69ffbbb-ht5c9"] Oct 02 11:15:43 crc kubenswrapper[4835]: I1002 11:15:43.343630 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b69ffbbb-ht5c9" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api" containerID="cri-o://f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4" gracePeriod=30 Oct 02 11:15:43 crc kubenswrapper[4835]: I1002 11:15:43.343634 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b69ffbbb-ht5c9" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api-log" 
containerID="cri-o://02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe" gracePeriod=30 Oct 02 11:15:43 crc kubenswrapper[4835]: I1002 11:15:43.397847 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aaa8b6c2-6884-4f83-bb0b-a866426ca426","Type":"ContainerStarted","Data":"06a8c4cc471196af7d5ad512e641b59f40f30fb553cb29da447e684bc2a92fca"} Oct 02 11:15:43 crc kubenswrapper[4835]: I1002 11:15:43.401682 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerStarted","Data":"d3a9b5b1ab09c6e811c0aaec9bc9f5da1191ea52cab6dc1f1285910cc39a5fbd"} Oct 02 11:15:43 crc kubenswrapper[4835]: I1002 11:15:43.408241 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" event={"ID":"ff2c9e49-6700-488d-bbcd-46812b7bf134","Type":"ContainerStarted","Data":"4d40ca68c2ad41b456a6455a9a22f83dad442354c1d06ec11c438bc5c4b693f0"} Oct 02 11:15:44 crc kubenswrapper[4835]: I1002 11:15:44.463796 4835 generic.go:334] "Generic (PLEG): container finished" podID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerID="02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe" exitCode=143 Oct 02 11:15:44 crc kubenswrapper[4835]: I1002 11:15:44.464165 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b69ffbbb-ht5c9" event={"ID":"813d1ab2-14f3-4726-9b9a-3fe1be2a1395","Type":"ContainerDied","Data":"02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe"} Oct 02 11:15:44 crc kubenswrapper[4835]: I1002 11:15:44.473623 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aaa8b6c2-6884-4f83-bb0b-a866426ca426","Type":"ContainerStarted","Data":"43cffac8d343e8b03cfa0bb956da0cc08b3eaf3aec3fe0354cf5516f53fbb1ca"} Oct 02 11:15:44 crc kubenswrapper[4835]: I1002 11:15:44.494837 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerStarted","Data":"ffb015ca858d0b3adb9b5ca9dbc2bd9e99ee48665270195bc3eff0d1db826f6f"} Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.320143 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.553826 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aaa8b6c2-6884-4f83-bb0b-a866426ca426","Type":"ContainerStarted","Data":"82567d2d11a36f919dfc050b19c59745ecff1497e09797066e5b2a9bf7ba6e07"} Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.590018 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerStarted","Data":"b791aabfc9ea9db8db624714dc3c71430bc6c7614233fa40ae4bfd6c2ac63e4c"} Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.592037 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.595825 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.595791724 podStartE2EDuration="4.595791724s" podCreationTimestamp="2025-10-02 11:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-02 11:15:45.581326219 +0000 UTC m=+1222.141233800" watchObservedRunningTime="2025-10-02 11:15:45.595791724 +0000 UTC m=+1222.155699305" Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.620466 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.634724 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.195452059 podStartE2EDuration="7.63470142s" podCreationTimestamp="2025-10-02 11:15:38 +0000 UTC" firstStartedPulling="2025-10-02 11:15:39.500208861 +0000 UTC m=+1216.060116442" lastFinishedPulling="2025-10-02 11:15:44.939458222 +0000 UTC m=+1221.499365803" observedRunningTime="2025-10-02 11:15:45.62318166 +0000 UTC m=+1222.183089251" watchObservedRunningTime="2025-10-02 11:15:45.63470142 +0000 UTC m=+1222.194609001" Oct 02 11:15:45 crc kubenswrapper[4835]: I1002 11:15:45.781464 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="42e4bd48-e5db-4d06-947f-63223788352f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.156:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.010815 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.312691 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.435266 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data-custom\") pod \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.435648 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-logs\") pod \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.435733 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6g8v\" (UniqueName: \"kubernetes.io/projected/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-kube-api-access-m6g8v\") pod \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.435909 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data\") pod \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.435972 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-combined-ca-bundle\") pod \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\" (UID: \"813d1ab2-14f3-4726-9b9a-3fe1be2a1395\") " Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.446018 4835 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-logs" (OuterVolumeSpecName: "logs") pod "813d1ab2-14f3-4726-9b9a-3fe1be2a1395" (UID: "813d1ab2-14f3-4726-9b9a-3fe1be2a1395"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.466381 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "813d1ab2-14f3-4726-9b9a-3fe1be2a1395" (UID: "813d1ab2-14f3-4726-9b9a-3fe1be2a1395"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.479544 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-kube-api-access-m6g8v" (OuterVolumeSpecName: "kube-api-access-m6g8v") pod "813d1ab2-14f3-4726-9b9a-3fe1be2a1395" (UID: "813d1ab2-14f3-4726-9b9a-3fe1be2a1395"). InnerVolumeSpecName "kube-api-access-m6g8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.517391 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "813d1ab2-14f3-4726-9b9a-3fe1be2a1395" (UID: "813d1ab2-14f3-4726-9b9a-3fe1be2a1395"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.537914 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.537942 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6g8v\" (UniqueName: \"kubernetes.io/projected/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-kube-api-access-m6g8v\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.537954 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.537963 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.572553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data" (OuterVolumeSpecName: "config-data") pod "813d1ab2-14f3-4726-9b9a-3fe1be2a1395" (UID: "813d1ab2-14f3-4726-9b9a-3fe1be2a1395"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.632457 4835 generic.go:334] "Generic (PLEG): container finished" podID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerID="f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4" exitCode=0 Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.634056 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b69ffbbb-ht5c9" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.635328 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b69ffbbb-ht5c9" event={"ID":"813d1ab2-14f3-4726-9b9a-3fe1be2a1395","Type":"ContainerDied","Data":"f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4"} Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.635407 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b69ffbbb-ht5c9" event={"ID":"813d1ab2-14f3-4726-9b9a-3fe1be2a1395","Type":"ContainerDied","Data":"c54ec2f27569fb4e84b5015404155a0f96745d34fbea4f63ca42bdbabbd0a43d"} Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.635432 4835 scope.go:117] "RemoveContainer" containerID="f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.640958 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d1ab2-14f3-4726-9b9a-3fe1be2a1395-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.758285 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b69ffbbb-ht5c9"] Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.763422 4835 scope.go:117] "RemoveContainer" containerID="02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.767889 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b69ffbbb-ht5c9"] Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.805128 4835 scope.go:117] "RemoveContainer" containerID="f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4" Oct 02 11:15:47 crc kubenswrapper[4835]: E1002 11:15:47.806020 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4\": container with ID starting with f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4 not found: ID does not exist" containerID="f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.806102 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4"} err="failed to get container status \"f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4\": rpc error: code = NotFound desc = could not find container \"f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4\": container with ID starting with f0677249865cd6ca06fc39430e412f198a8b3f231926c9199ad515deeeb927e4 not found: ID does not exist" Oct 02 11:15:47 crc kubenswrapper[4835]: I1002 11:15:47.806129 4835 scope.go:117] "RemoveContainer" containerID="02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe" Oct 02 11:15:47 crc kubenswrapper[4835]: E1002 11:15:47.806851 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe\": container with ID starting with 02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe not found: ID does not exist" containerID="02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe" Oct 02 11:15:47 crc 
kubenswrapper[4835]: I1002 11:15:47.806878 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe"} err="failed to get container status \"02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe\": rpc error: code = NotFound desc = could not find container \"02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe\": container with ID starting with 02213df383a385adc7dd1dd1696b195944b2b9b9a04bec317c713adfb6f498fe not found: ID does not exist" Oct 02 11:15:48 crc kubenswrapper[4835]: I1002 11:15:48.262401 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" path="/var/lib/kubelet/pods/813d1ab2-14f3-4726-9b9a-3fe1be2a1395/volumes" Oct 02 11:15:52 crc kubenswrapper[4835]: I1002 11:15:52.271739 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:15:56 crc kubenswrapper[4835]: I1002 11:15:56.778703 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" event={"ID":"ff2c9e49-6700-488d-bbcd-46812b7bf134","Type":"ContainerStarted","Data":"acc301e91bc70fe750761770f498da7d676fe40a139bbfad72da3abe10357c37"} Oct 02 11:15:56 crc kubenswrapper[4835]: I1002 11:15:56.802165 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" podStartSLOduration=3.035106715 podStartE2EDuration="15.802140779s" podCreationTimestamp="2025-10-02 11:15:41 +0000 UTC" firstStartedPulling="2025-10-02 11:15:42.973916157 +0000 UTC m=+1219.533823738" lastFinishedPulling="2025-10-02 11:15:55.740950221 +0000 UTC m=+1232.300857802" observedRunningTime="2025-10-02 11:15:56.797389502 +0000 UTC m=+1233.357297083" watchObservedRunningTime="2025-10-02 11:15:56.802140779 +0000 UTC m=+1233.362048360" Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.247539 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.248119 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-central-agent" containerID="cri-o://f7a09d1b2320770b3ca6bb8c61bfa4ad16222f35ccb9ee77db469b6f6d0a6f12" gracePeriod=30 Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.248263 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="proxy-httpd" containerID="cri-o://b791aabfc9ea9db8db624714dc3c71430bc6c7614233fa40ae4bfd6c2ac63e4c" gracePeriod=30 Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.248304 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-notification-agent" containerID="cri-o://d3a9b5b1ab09c6e811c0aaec9bc9f5da1191ea52cab6dc1f1285910cc39a5fbd" gracePeriod=30 Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.248263 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="sg-core" containerID="cri-o://ffb015ca858d0b3adb9b5ca9dbc2bd9e99ee48665270195bc3eff0d1db826f6f" gracePeriod=30 Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 
11:16:01.258437 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.160:3000/\": read tcp 10.217.0.2:47404->10.217.0.160:3000: read: connection reset by peer" Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.830395 4835 generic.go:334] "Generic (PLEG): container finished" podID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerID="b791aabfc9ea9db8db624714dc3c71430bc6c7614233fa40ae4bfd6c2ac63e4c" exitCode=0 Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.830621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerDied","Data":"b791aabfc9ea9db8db624714dc3c71430bc6c7614233fa40ae4bfd6c2ac63e4c"} Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.830714 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerDied","Data":"ffb015ca858d0b3adb9b5ca9dbc2bd9e99ee48665270195bc3eff0d1db826f6f"} Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.830666 4835 generic.go:334] "Generic (PLEG): container finished" podID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerID="ffb015ca858d0b3adb9b5ca9dbc2bd9e99ee48665270195bc3eff0d1db826f6f" exitCode=2 Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.830760 4835 generic.go:334] "Generic (PLEG): container finished" podID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerID="f7a09d1b2320770b3ca6bb8c61bfa4ad16222f35ccb9ee77db469b6f6d0a6f12" exitCode=0 Oct 02 11:16:01 crc kubenswrapper[4835]: I1002 11:16:01.830792 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerDied","Data":"f7a09d1b2320770b3ca6bb8c61bfa4ad16222f35ccb9ee77db469b6f6d0a6f12"} Oct 02 11:16:02 crc kubenswrapper[4835]: I1002 11:16:02.846520 4835 generic.go:334] "Generic (PLEG): container finished" podID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerID="d3a9b5b1ab09c6e811c0aaec9bc9f5da1191ea52cab6dc1f1285910cc39a5fbd" exitCode=0 Oct 02 11:16:02 crc kubenswrapper[4835]: I1002 11:16:02.846684 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerDied","Data":"d3a9b5b1ab09c6e811c0aaec9bc9f5da1191ea52cab6dc1f1285910cc39a5fbd"} Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.011894 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.179830 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-sg-core-conf-yaml\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.179923 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-run-httpd\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.180055 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmx8m\" (UniqueName: \"kubernetes.io/projected/14ec883d-a110-4e19-ba6d-7435b5e76cbc-kube-api-access-pmx8m\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.180104 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-ceilometer-tls-certs\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.180132 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-config-data\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.180192 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-scripts\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.180240 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-combined-ca-bundle\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.180320 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-log-httpd\") pod \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\" (UID: \"14ec883d-a110-4e19-ba6d-7435b5e76cbc\") " Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.181055 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.183784 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.187978 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ec883d-a110-4e19-ba6d-7435b5e76cbc-kube-api-access-pmx8m" (OuterVolumeSpecName: "kube-api-access-pmx8m") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "kube-api-access-pmx8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.196692 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-scripts" (OuterVolumeSpecName: "scripts") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.227917 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.254471 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.265207 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.282803 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.282844 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.282858 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.282870 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.282880 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14ec883d-a110-4e19-ba6d-7435b5e76cbc-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.282890 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmx8m\" (UniqueName: \"kubernetes.io/projected/14ec883d-a110-4e19-ba6d-7435b5e76cbc-kube-api-access-pmx8m\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.282901 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.322953 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-config-data" (OuterVolumeSpecName: "config-data") pod "14ec883d-a110-4e19-ba6d-7435b5e76cbc" (UID: "14ec883d-a110-4e19-ba6d-7435b5e76cbc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.384602 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ec883d-a110-4e19-ba6d-7435b5e76cbc-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.856595 4835 generic.go:334] "Generic (PLEG): container finished" podID="af74a3af-19d5-45bf-b366-5e79fe901079" containerID="c1d1c1bc0be760576a311187ca4f80eeec19da3191c231e3a0a2983e09e1f0ea" exitCode=0 Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.856665 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rk9m4" event={"ID":"af74a3af-19d5-45bf-b366-5e79fe901079","Type":"ContainerDied","Data":"c1d1c1bc0be760576a311187ca4f80eeec19da3191c231e3a0a2983e09e1f0ea"} Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.860383 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14ec883d-a110-4e19-ba6d-7435b5e76cbc","Type":"ContainerDied","Data":"f0e0776bade9db3fb3ff46477f6c9f9d0dad5712a232d70b03be3a34a576e2e2"} Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.860433 4835 scope.go:117] "RemoveContainer" containerID="b791aabfc9ea9db8db624714dc3c71430bc6c7614233fa40ae4bfd6c2ac63e4c" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.860506 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.895710 4835 scope.go:117] "RemoveContainer" containerID="ffb015ca858d0b3adb9b5ca9dbc2bd9e99ee48665270195bc3eff0d1db826f6f" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.902849 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.913434 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.929522 4835 scope.go:117] "RemoveContainer" containerID="d3a9b5b1ab09c6e811c0aaec9bc9f5da1191ea52cab6dc1f1285910cc39a5fbd" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.932937 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:03 crc kubenswrapper[4835]: E1002 11:16:03.933402 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api-log" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933422 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api-log" Oct 02 11:16:03 crc kubenswrapper[4835]: E1002 11:16:03.933446 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-central-agent" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933453 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-central-agent" Oct 02 11:16:03 crc kubenswrapper[4835]: E1002 11:16:03.933467 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="sg-core" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933473 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="sg-core" Oct 02 11:16:03 crc 
kubenswrapper[4835]: E1002 11:16:03.933481 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="proxy-httpd" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933487 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="proxy-httpd" Oct 02 11:16:03 crc kubenswrapper[4835]: E1002 11:16:03.933508 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-notification-agent" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933514 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-notification-agent" Oct 02 11:16:03 crc kubenswrapper[4835]: E1002 11:16:03.933534 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933542 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933702 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="sg-core" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933719 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-notification-agent" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933729 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="ceilometer-central-agent" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933738 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" containerName="proxy-httpd" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933747 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api-log" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.933755 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="813d1ab2-14f3-4726-9b9a-3fe1be2a1395" containerName="barbican-api" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.937369 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.944820 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.944844 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.945866 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.951945 4835 scope.go:117] "RemoveContainer" containerID="f7a09d1b2320770b3ca6bb8c61bfa4ad16222f35ccb9ee77db469b6f6d0a6f12" Oct 02 11:16:03 crc kubenswrapper[4835]: I1002 11:16:03.958822 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139424 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-config-data\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139469 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-run-httpd\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-log-httpd\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139547 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139570 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-scripts\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139590 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139612 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.139643 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9tk\" (UniqueName: \"kubernetes.io/projected/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-kube-api-access-rb9tk\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241005 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-config-data\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241066 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-run-httpd\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241113 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-log-httpd\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241185 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-scripts\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241299 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.241351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9tk\" (UniqueName: \"kubernetes.io/projected/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-kube-api-access-rb9tk\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.242514 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-log-httpd\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 
11:16:04.242527 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-run-httpd\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.247292 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-scripts\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.247715 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-config-data\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.248577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.249151 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.249733 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.262811 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9tk\" (UniqueName: \"kubernetes.io/projected/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-kube-api-access-rb9tk\") pod \"ceilometer-0\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " pod="openstack/ceilometer-0" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.264797 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ec883d-a110-4e19-ba6d-7435b5e76cbc" path="/var/lib/kubelet/pods/14ec883d-a110-4e19-ba6d-7435b5e76cbc/volumes" Oct 02 11:16:04 crc kubenswrapper[4835]: I1002 11:16:04.555691 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.014924 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.309975 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.467553 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-config\") pod \"af74a3af-19d5-45bf-b366-5e79fe901079\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.467642 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-combined-ca-bundle\") pod \"af74a3af-19d5-45bf-b366-5e79fe901079\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.467691 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnrrb\" (UniqueName: \"kubernetes.io/projected/af74a3af-19d5-45bf-b366-5e79fe901079-kube-api-access-qnrrb\") pod \"af74a3af-19d5-45bf-b366-5e79fe901079\" (UID: \"af74a3af-19d5-45bf-b366-5e79fe901079\") " Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.474455 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af74a3af-19d5-45bf-b366-5e79fe901079-kube-api-access-qnrrb" (OuterVolumeSpecName: "kube-api-access-qnrrb") pod "af74a3af-19d5-45bf-b366-5e79fe901079" (UID: "af74a3af-19d5-45bf-b366-5e79fe901079"). InnerVolumeSpecName "kube-api-access-qnrrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.504580 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af74a3af-19d5-45bf-b366-5e79fe901079" (UID: "af74a3af-19d5-45bf-b366-5e79fe901079"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.518497 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-config" (OuterVolumeSpecName: "config") pod "af74a3af-19d5-45bf-b366-5e79fe901079" (UID: "af74a3af-19d5-45bf-b366-5e79fe901079"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.569834 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.569878 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af74a3af-19d5-45bf-b366-5e79fe901079-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.569889 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnrrb\" (UniqueName: \"kubernetes.io/projected/af74a3af-19d5-45bf-b366-5e79fe901079-kube-api-access-qnrrb\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.941539 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rk9m4" event={"ID":"af74a3af-19d5-45bf-b366-5e79fe901079","Type":"ContainerDied","Data":"593980dde5525de1afd38338dfba9799b7cf1182bb75e45209ffd4d669e20f83"} Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.941857 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593980dde5525de1afd38338dfba9799b7cf1182bb75e45209ffd4d669e20f83" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.941935 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rk9m4" Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.948898 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerStarted","Data":"ec261a8a67c6ea22442460cefcb32c24ca9aec0b8362191853907cdf23621222"} Oct 02 11:16:05 crc kubenswrapper[4835]: I1002 11:16:05.948952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerStarted","Data":"5dfd16d8040296ce5079f9d1f2f6e13c9a5bf256677cb8eec9eb750bf07c302c"} Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.136500 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-9tspq"] Oct 02 11:16:06 crc kubenswrapper[4835]: E1002 11:16:06.136999 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af74a3af-19d5-45bf-b366-5e79fe901079" containerName="neutron-db-sync" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.137015 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af74a3af-19d5-45bf-b366-5e79fe901079" containerName="neutron-db-sync" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.137194 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af74a3af-19d5-45bf-b366-5e79fe901079" containerName="neutron-db-sync" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.140594 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.175299 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-9tspq"] Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.275748 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76cd984794-hg967"] Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.277286 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.283680 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcdb2\" (UniqueName: \"kubernetes.io/projected/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-kube-api-access-gcdb2\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.283745 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-config\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.283812 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.283843 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.283873 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.284864 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.285101 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.285701 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2nd9j" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.285847 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.289342 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cd984794-hg967"] Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.385451 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.385514 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.385563 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-config\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.385641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.385834 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-httpd-config\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.385937 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-ovndb-tls-certs\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.386026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcdb2\" (UniqueName: \"kubernetes.io/projected/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-kube-api-access-gcdb2\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.386060 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-config\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.386134 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzd2n\" (UniqueName: \"kubernetes.io/projected/e02806da-6104-48e3-8d94-db8475e53b68-kube-api-access-bzd2n\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.386174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-combined-ca-bundle\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.386795 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-dns-svc\") pod 
\"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.387147 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.387702 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.390067 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-config\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.411253 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcdb2\" (UniqueName: \"kubernetes.io/projected/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-kube-api-access-gcdb2\") pod \"dnsmasq-dns-6d97fcdd8f-9tspq\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.480174 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.487761 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-config\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.487844 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-httpd-config\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.487909 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-ovndb-tls-certs\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.487942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzd2n\" (UniqueName: \"kubernetes.io/projected/e02806da-6104-48e3-8d94-db8475e53b68-kube-api-access-bzd2n\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.487963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-combined-ca-bundle\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.492895 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-combined-ca-bundle\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.494087 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-ovndb-tls-certs\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.498892 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-httpd-config\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.500140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-config\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.516055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzd2n\" (UniqueName: \"kubernetes.io/projected/e02806da-6104-48e3-8d94-db8475e53b68-kube-api-access-bzd2n\") pod \"neutron-76cd984794-hg967\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.605075 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:06 crc kubenswrapper[4835]: I1002 11:16:06.967253 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerStarted","Data":"6d78180a4b752862237b6a7bb0803ed51926d06d4183334e6917eb75ec371247"} Oct 02 11:16:07 crc kubenswrapper[4835]: I1002 11:16:07.077609 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-9tspq"] Oct 02 11:16:07 crc kubenswrapper[4835]: I1002 11:16:07.252352 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:07 crc kubenswrapper[4835]: I1002 11:16:07.981712 4835 generic.go:334] "Generic (PLEG): container finished" podID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerID="28f4c47a410fca20364d42e73a3e576420f90c71d9d5dd811a18a67820f3ac32" exitCode=0 Oct 02 11:16:07 crc kubenswrapper[4835]: I1002 11:16:07.981783 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" event={"ID":"c0074a72-5ca4-452b-8c69-f8ad99e14a0f","Type":"ContainerDied","Data":"28f4c47a410fca20364d42e73a3e576420f90c71d9d5dd811a18a67820f3ac32"} Oct 02 11:16:07 crc kubenswrapper[4835]: I1002 11:16:07.982039 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" event={"ID":"c0074a72-5ca4-452b-8c69-f8ad99e14a0f","Type":"ContainerStarted","Data":"5bb94e4c6f83eb31527fa62b4eba4d8c7a381b53ca0485935f77273ca8d192aa"} Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.194251 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cd984794-hg967"] Oct 02 11:16:08 crc kubenswrapper[4835]: W1002 11:16:08.198478 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode02806da_6104_48e3_8d94_db8475e53b68.slice/crio-4328d3a8c34794fb2d6b8db0fe5f7a4643fcd1f3355f34fbb8d98c9f351c5d22 WatchSource:0}: Error finding container 4328d3a8c34794fb2d6b8db0fe5f7a4643fcd1f3355f34fbb8d98c9f351c5d22: Status 404 returned error can't find the container with id 4328d3a8c34794fb2d6b8db0fe5f7a4643fcd1f3355f34fbb8d98c9f351c5d22 Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.775365 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d568cc985-z84bp"] Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.777563 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.795631 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d568cc985-z84bp"] Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.799800 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.800379 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.871015 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-combined-ca-bundle\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.871350 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8t4z\" (UniqueName: \"kubernetes.io/projected/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-kube-api-access-l8t4z\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.871446 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-internal-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.871559 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-ovndb-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.871676 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-httpd-config\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.871815 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-config\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.871952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-public-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.973544 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-combined-ca-bundle\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.973997 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8t4z\" (UniqueName: \"kubernetes.io/projected/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-kube-api-access-l8t4z\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.974018 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-internal-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.974046 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-ovndb-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.974071 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-httpd-config\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.974117 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-config\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.974149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-public-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.980297 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-httpd-config\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.980446 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-ovndb-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.981373 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-internal-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: 
\"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.993080 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-public-tls-certs\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.994386 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-combined-ca-bundle\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.994767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-config\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:08 crc kubenswrapper[4835]: I1002 11:16:08.996997 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerStarted","Data":"2f6a11492420e0e9c565233bbc53a80aeb16318f99d64cd365024243e2bc0829"} Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:08.999489 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cd984794-hg967" event={"ID":"e02806da-6104-48e3-8d94-db8475e53b68","Type":"ContainerStarted","Data":"6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582"} Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:08.999541 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cd984794-hg967" event={"ID":"e02806da-6104-48e3-8d94-db8475e53b68","Type":"ContainerStarted","Data":"4328d3a8c34794fb2d6b8db0fe5f7a4643fcd1f3355f34fbb8d98c9f351c5d22"} Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:09.001812 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" event={"ID":"c0074a72-5ca4-452b-8c69-f8ad99e14a0f","Type":"ContainerStarted","Data":"b3754b5f33119b5d94162cdee22794bc3130c3996f67195a78f0dfc70fa52cb4"} Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:09.002419 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:09.003261 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8t4z\" (UniqueName: \"kubernetes.io/projected/e08e86a7-2ef3-48ad-82ab-cffc2007fd24-kube-api-access-l8t4z\") pod \"neutron-6d568cc985-z84bp\" (UID: \"e08e86a7-2ef3-48ad-82ab-cffc2007fd24\") " pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:09.037424 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" podStartSLOduration=3.037398709 podStartE2EDuration="3.037398709s" podCreationTimestamp="2025-10-02 11:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:09.028971965 +0000 UTC m=+1245.588879566" watchObservedRunningTime="2025-10-02 11:16:09.037398709 +0000 UTC 
m=+1245.597306290" Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:09.187029 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:09 crc kubenswrapper[4835]: I1002 11:16:09.822743 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d568cc985-z84bp"] Oct 02 11:16:09 crc kubenswrapper[4835]: W1002 11:16:09.835424 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode08e86a7_2ef3_48ad_82ab_cffc2007fd24.slice/crio-8cb1335b9675675dc83a7ec9e534e7ab75c0a6b5281310bc2e46fec68e48f493 WatchSource:0}: Error finding container 8cb1335b9675675dc83a7ec9e534e7ab75c0a6b5281310bc2e46fec68e48f493: Status 404 returned error can't find the container with id 8cb1335b9675675dc83a7ec9e534e7ab75c0a6b5281310bc2e46fec68e48f493 Oct 02 11:16:10 crc kubenswrapper[4835]: I1002 11:16:10.084550 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cd984794-hg967" event={"ID":"e02806da-6104-48e3-8d94-db8475e53b68","Type":"ContainerStarted","Data":"89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a"} Oct 02 11:16:10 crc kubenswrapper[4835]: I1002 11:16:10.086639 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:10 crc kubenswrapper[4835]: I1002 11:16:10.107746 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d568cc985-z84bp" event={"ID":"e08e86a7-2ef3-48ad-82ab-cffc2007fd24","Type":"ContainerStarted","Data":"8cb1335b9675675dc83a7ec9e534e7ab75c0a6b5281310bc2e46fec68e48f493"} Oct 02 11:16:10 crc kubenswrapper[4835]: I1002 11:16:10.123015 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76cd984794-hg967" podStartSLOduration=4.122987961 podStartE2EDuration="4.122987961s" podCreationTimestamp="2025-10-02 11:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:10.118754689 +0000 UTC m=+1246.678662280" watchObservedRunningTime="2025-10-02 11:16:10.122987961 +0000 UTC m=+1246.682895552" Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.131502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerStarted","Data":"4a0450bc99df670b8604d536f594773cb2132b0f51beac27c88aad3374783565"} Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.131613 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-central-agent" containerID="cri-o://ec261a8a67c6ea22442460cefcb32c24ca9aec0b8362191853907cdf23621222" gracePeriod=30 Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.131740 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="sg-core" containerID="cri-o://2f6a11492420e0e9c565233bbc53a80aeb16318f99d64cd365024243e2bc0829" gracePeriod=30 Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.131791 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="proxy-httpd" 
containerID="cri-o://4a0450bc99df670b8604d536f594773cb2132b0f51beac27c88aad3374783565" gracePeriod=30 Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.131788 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-notification-agent" containerID="cri-o://6d78180a4b752862237b6a7bb0803ed51926d06d4183334e6917eb75ec371247" gracePeriod=30 Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.132029 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.138622 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d568cc985-z84bp" event={"ID":"e08e86a7-2ef3-48ad-82ab-cffc2007fd24","Type":"ContainerStarted","Data":"b6e86b3419c83c164571a8010abf669cacddbdefe61f790262dfc130f84b1d7c"} Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.138685 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d568cc985-z84bp" event={"ID":"e08e86a7-2ef3-48ad-82ab-cffc2007fd24","Type":"ContainerStarted","Data":"0635df58119a550625220e7ac353355864c055516824347696ac392a35d71e1b"} Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.138911 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.164984 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.214798592 podStartE2EDuration="8.164956964s" podCreationTimestamp="2025-10-02 11:16:03 +0000 UTC" firstStartedPulling="2025-10-02 11:16:05.030956782 +0000 UTC m=+1241.590864363" lastFinishedPulling="2025-10-02 11:16:09.981115154 +0000 UTC m=+1246.541022735" observedRunningTime="2025-10-02 11:16:11.156015756 +0000 UTC m=+1247.715923347" watchObservedRunningTime="2025-10-02 11:16:11.164956964 +0000 UTC m=+1247.724864545" Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.190986 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d568cc985-z84bp" podStartSLOduration=3.190962975 podStartE2EDuration="3.190962975s" podCreationTimestamp="2025-10-02 11:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:11.182627115 +0000 UTC m=+1247.742534716" watchObservedRunningTime="2025-10-02 11:16:11.190962975 +0000 UTC m=+1247.750870556" Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.984032 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:16:11 crc kubenswrapper[4835]: I1002 11:16:11.984404 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152469 4835 generic.go:334] "Generic (PLEG): container finished" podID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" 
containerID="4a0450bc99df670b8604d536f594773cb2132b0f51beac27c88aad3374783565" exitCode=0 Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152512 4835 generic.go:334] "Generic (PLEG): container finished" podID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerID="2f6a11492420e0e9c565233bbc53a80aeb16318f99d64cd365024243e2bc0829" exitCode=2 Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152522 4835 generic.go:334] "Generic (PLEG): container finished" podID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerID="6d78180a4b752862237b6a7bb0803ed51926d06d4183334e6917eb75ec371247" exitCode=0 Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152531 4835 generic.go:334] "Generic (PLEG): container finished" podID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerID="ec261a8a67c6ea22442460cefcb32c24ca9aec0b8362191853907cdf23621222" exitCode=0 Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152553 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerDied","Data":"4a0450bc99df670b8604d536f594773cb2132b0f51beac27c88aad3374783565"} Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152617 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerDied","Data":"2f6a11492420e0e9c565233bbc53a80aeb16318f99d64cd365024243e2bc0829"} Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152630 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerDied","Data":"6d78180a4b752862237b6a7bb0803ed51926d06d4183334e6917eb75ec371247"} Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.152642 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerDied","Data":"ec261a8a67c6ea22442460cefcb32c24ca9aec0b8362191853907cdf23621222"} Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.292127 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447259 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-sg-core-conf-yaml\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-combined-ca-bundle\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447346 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-run-httpd\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447427 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-log-httpd\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447501 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-ceilometer-tls-certs\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447532 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-config-data\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447562 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-scripts\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.447589 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb9tk\" (UniqueName: \"kubernetes.io/projected/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-kube-api-access-rb9tk\") pod \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\" (UID: \"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c\") " Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.448616 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.448734 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.455397 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-kube-api-access-rb9tk" (OuterVolumeSpecName: "kube-api-access-rb9tk") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "kube-api-access-rb9tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.456423 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-scripts" (OuterVolumeSpecName: "scripts") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.502043 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.532510 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.550274 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.550313 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb9tk\" (UniqueName: \"kubernetes.io/projected/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-kube-api-access-rb9tk\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.550325 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.550334 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.550343 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.550350 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.568588 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.570655 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-config-data" (OuterVolumeSpecName: "config-data") pod "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" (UID: "d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.652712 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:12 crc kubenswrapper[4835]: I1002 11:16:12.652760 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.164405 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c","Type":"ContainerDied","Data":"5dfd16d8040296ce5079f9d1f2f6e13c9a5bf256677cb8eec9eb750bf07c302c"} Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.164467 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.164476 4835 scope.go:117] "RemoveContainer" containerID="4a0450bc99df670b8604d536f594773cb2132b0f51beac27c88aad3374783565" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.204028 4835 scope.go:117] "RemoveContainer" containerID="2f6a11492420e0e9c565233bbc53a80aeb16318f99d64cd365024243e2bc0829" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.208173 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.216760 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232289 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:13 crc kubenswrapper[4835]: E1002 11:16:13.232671 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="proxy-httpd" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232687 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="proxy-httpd" Oct 02 11:16:13 crc kubenswrapper[4835]: E1002 11:16:13.232717 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="sg-core" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232723 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="sg-core" Oct 02 11:16:13 crc kubenswrapper[4835]: E1002 11:16:13.232736 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-central-agent" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232742 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-central-agent" Oct 02 11:16:13 crc kubenswrapper[4835]: E1002 11:16:13.232756 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-notification-agent" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232762 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-notification-agent" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232936 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-central-agent" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232951 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="sg-core" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232962 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="ceilometer-notification-agent" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.232974 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" containerName="proxy-httpd" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.233596 4835 scope.go:117] "RemoveContainer" containerID="6d78180a4b752862237b6a7bb0803ed51926d06d4183334e6917eb75ec371247" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.234889 4835 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.238146 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.238560 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.242103 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.250325 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.277483 4835 scope.go:117] "RemoveContainer" containerID="ec261a8a67c6ea22442460cefcb32c24ca9aec0b8362191853907cdf23621222" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.365683 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.366364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.366413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.366452 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-log-httpd\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.366600 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-config-data\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.366929 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmt8j\" (UniqueName: \"kubernetes.io/projected/be7ab938-6666-4687-aa3b-61b4da0d7ac6-kube-api-access-fmt8j\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.367089 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-scripts\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 
11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.367318 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-run-httpd\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.469457 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-run-httpd\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.469982 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.470415 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-run-httpd\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.471061 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.471252 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.471393 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-log-httpd\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.471700 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-log-httpd\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.471967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-config-data\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.472294 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmt8j\" (UniqueName: \"kubernetes.io/projected/be7ab938-6666-4687-aa3b-61b4da0d7ac6-kube-api-access-fmt8j\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 
crc kubenswrapper[4835]: I1002 11:16:13.472461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-scripts\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.476626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.477812 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-config-data\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.478203 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.478605 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-scripts\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.486094 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.495365 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmt8j\" (UniqueName: \"kubernetes.io/projected/be7ab938-6666-4687-aa3b-61b4da0d7ac6-kube-api-access-fmt8j\") pod \"ceilometer-0\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.554721 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:13 crc kubenswrapper[4835]: I1002 11:16:13.842885 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:14 crc kubenswrapper[4835]: I1002 11:16:14.068608 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:14 crc kubenswrapper[4835]: I1002 11:16:14.174484 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerStarted","Data":"d967b76a8d1885de12089108f1c9b240207412de03f9482876947cf86f5ddd2d"} Oct 02 11:16:14 crc kubenswrapper[4835]: I1002 11:16:14.264238 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c" path="/var/lib/kubelet/pods/d2cb0aa5-d15e-4ca0-81ff-54fcf44d528c/volumes" Oct 02 11:16:15 crc kubenswrapper[4835]: I1002 11:16:15.192442 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerStarted","Data":"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6"} Oct 02 11:16:16 crc kubenswrapper[4835]: I1002 11:16:16.204126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerStarted","Data":"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1"} Oct 02 11:16:16 crc kubenswrapper[4835]: I1002 11:16:16.482133 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:16 crc kubenswrapper[4835]: I1002 11:16:16.551057 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b76cdf485-qt4lf"] Oct 02 11:16:16 crc kubenswrapper[4835]: I1002 11:16:16.552355 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerName="dnsmasq-dns" containerID="cri-o://8ef9aa002581037af6725b688d3bc63cb018749d0d9c7edbcc233bf191cf618a" gracePeriod=10 Oct 02 11:16:17 crc kubenswrapper[4835]: I1002 11:16:17.222548 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerStarted","Data":"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0"} Oct 02 11:16:17 crc kubenswrapper[4835]: I1002 11:16:17.224771 4835 generic.go:334] "Generic (PLEG): container finished" podID="ff2c9e49-6700-488d-bbcd-46812b7bf134" containerID="acc301e91bc70fe750761770f498da7d676fe40a139bbfad72da3abe10357c37" exitCode=0 Oct 02 11:16:17 crc kubenswrapper[4835]: I1002 11:16:17.224800 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" event={"ID":"ff2c9e49-6700-488d-bbcd-46812b7bf134","Type":"ContainerDied","Data":"acc301e91bc70fe750761770f498da7d676fe40a139bbfad72da3abe10357c37"} Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.235110 4835 generic.go:334] "Generic (PLEG): container finished" podID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerID="8ef9aa002581037af6725b688d3bc63cb018749d0d9c7edbcc233bf191cf618a" exitCode=0 Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.235172 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" 
event={"ID":"9c187566-73e0-40aa-a7db-a0cde5431b83","Type":"ContainerDied","Data":"8ef9aa002581037af6725b688d3bc63cb018749d0d9c7edbcc233bf191cf618a"} Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.741620 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.940655 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74fst\" (UniqueName: \"kubernetes.io/projected/ff2c9e49-6700-488d-bbcd-46812b7bf134-kube-api-access-74fst\") pod \"ff2c9e49-6700-488d-bbcd-46812b7bf134\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.940765 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-config-data\") pod \"ff2c9e49-6700-488d-bbcd-46812b7bf134\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.940838 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-scripts\") pod \"ff2c9e49-6700-488d-bbcd-46812b7bf134\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.940875 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-combined-ca-bundle\") pod \"ff2c9e49-6700-488d-bbcd-46812b7bf134\" (UID: \"ff2c9e49-6700-488d-bbcd-46812b7bf134\") " Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.947941 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-scripts" (OuterVolumeSpecName: "scripts") pod "ff2c9e49-6700-488d-bbcd-46812b7bf134" (UID: "ff2c9e49-6700-488d-bbcd-46812b7bf134"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.948682 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2c9e49-6700-488d-bbcd-46812b7bf134-kube-api-access-74fst" (OuterVolumeSpecName: "kube-api-access-74fst") pod "ff2c9e49-6700-488d-bbcd-46812b7bf134" (UID: "ff2c9e49-6700-488d-bbcd-46812b7bf134"). InnerVolumeSpecName "kube-api-access-74fst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.972809 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff2c9e49-6700-488d-bbcd-46812b7bf134" (UID: "ff2c9e49-6700-488d-bbcd-46812b7bf134"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:18 crc kubenswrapper[4835]: I1002 11:16:18.985429 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-config-data" (OuterVolumeSpecName: "config-data") pod "ff2c9e49-6700-488d-bbcd-46812b7bf134" (UID: "ff2c9e49-6700-488d-bbcd-46812b7bf134"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.046378 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.046415 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74fst\" (UniqueName: \"kubernetes.io/projected/ff2c9e49-6700-488d-bbcd-46812b7bf134-kube-api-access-74fst\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.046428 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.046436 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2c9e49-6700-488d-bbcd-46812b7bf134-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.254930 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: connect: connection refused" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.261344 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" event={"ID":"ff2c9e49-6700-488d-bbcd-46812b7bf134","Type":"ContainerDied","Data":"4d40ca68c2ad41b456a6455a9a22f83dad442354c1d06ec11c438bc5c4b693f0"} Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.261406 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d40ca68c2ad41b456a6455a9a22f83dad442354c1d06ec11c438bc5c4b693f0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.261446 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qhlf" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.339358 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:19 crc kubenswrapper[4835]: E1002 11:16:19.339754 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2c9e49-6700-488d-bbcd-46812b7bf134" containerName="nova-cell0-conductor-db-sync" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.339774 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2c9e49-6700-488d-bbcd-46812b7bf134" containerName="nova-cell0-conductor-db-sync" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.339979 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2c9e49-6700-488d-bbcd-46812b7bf134" containerName="nova-cell0-conductor-db-sync" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.340663 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.342665 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ltbl8" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.343293 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.355349 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.453242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.455394 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.455467 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdsw2\" (UniqueName: \"kubernetes.io/projected/7e339cea-b064-4e8b-a147-af1337175248-kube-api-access-zdsw2\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.565617 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.565732 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.565775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdsw2\" (UniqueName: \"kubernetes.io/projected/7e339cea-b064-4e8b-a147-af1337175248-kube-api-access-zdsw2\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.570470 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.570543 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.596520 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdsw2\" (UniqueName: \"kubernetes.io/projected/7e339cea-b064-4e8b-a147-af1337175248-kube-api-access-zdsw2\") pod \"nova-cell0-conductor-0\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.659629 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.803621 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.871858 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-nb\") pod \"9c187566-73e0-40aa-a7db-a0cde5431b83\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.871997 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-sb\") pod \"9c187566-73e0-40aa-a7db-a0cde5431b83\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.872037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-dns-svc\") pod \"9c187566-73e0-40aa-a7db-a0cde5431b83\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.872094 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-config\") pod \"9c187566-73e0-40aa-a7db-a0cde5431b83\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.872152 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl7x7\" (UniqueName: \"kubernetes.io/projected/9c187566-73e0-40aa-a7db-a0cde5431b83-kube-api-access-vl7x7\") pod \"9c187566-73e0-40aa-a7db-a0cde5431b83\" (UID: \"9c187566-73e0-40aa-a7db-a0cde5431b83\") " Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.883349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c187566-73e0-40aa-a7db-a0cde5431b83-kube-api-access-vl7x7" (OuterVolumeSpecName: "kube-api-access-vl7x7") pod "9c187566-73e0-40aa-a7db-a0cde5431b83" (UID: "9c187566-73e0-40aa-a7db-a0cde5431b83"). InnerVolumeSpecName "kube-api-access-vl7x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.939346 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c187566-73e0-40aa-a7db-a0cde5431b83" (UID: "9c187566-73e0-40aa-a7db-a0cde5431b83"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.952358 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c187566-73e0-40aa-a7db-a0cde5431b83" (UID: "9c187566-73e0-40aa-a7db-a0cde5431b83"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.960523 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-config" (OuterVolumeSpecName: "config") pod "9c187566-73e0-40aa-a7db-a0cde5431b83" (UID: "9c187566-73e0-40aa-a7db-a0cde5431b83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.964784 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c187566-73e0-40aa-a7db-a0cde5431b83" (UID: "9c187566-73e0-40aa-a7db-a0cde5431b83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.974513 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.974548 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.974561 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.974569 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c187566-73e0-40aa-a7db-a0cde5431b83-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:19 crc kubenswrapper[4835]: I1002 11:16:19.974578 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl7x7\" (UniqueName: \"kubernetes.io/projected/9c187566-73e0-40aa-a7db-a0cde5431b83-kube-api-access-vl7x7\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.176343 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.283321 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7e339cea-b064-4e8b-a147-af1337175248","Type":"ContainerStarted","Data":"020874ee770c1128a973b7b4f7c05b37e18547b6c50a0393722d3e14ae6754d8"} Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.290406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" event={"ID":"9c187566-73e0-40aa-a7db-a0cde5431b83","Type":"ContainerDied","Data":"e204b616555b316e8b49a0d45241971461c04141498dd85b0cf3e8405009c4cf"} Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.290485 4835 scope.go:117] "RemoveContainer" 
containerID="8ef9aa002581037af6725b688d3bc63cb018749d0d9c7edbcc233bf191cf618a" Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.290482 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b76cdf485-qt4lf" Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.321560 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b76cdf485-qt4lf"] Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.321847 4835 scope.go:117] "RemoveContainer" containerID="7b371cec8c743bde02b3a84e6736dc65f9b7693beaccf5bc0c8dce9886bf92a0" Oct 02 11:16:20 crc kubenswrapper[4835]: I1002 11:16:20.331978 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b76cdf485-qt4lf"] Oct 02 11:16:21 crc kubenswrapper[4835]: I1002 11:16:21.306117 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:21 crc kubenswrapper[4835]: I1002 11:16:21.306287 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7e339cea-b064-4e8b-a147-af1337175248","Type":"ContainerStarted","Data":"29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb"} Oct 02 11:16:21 crc kubenswrapper[4835]: I1002 11:16:21.333792 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.333757284 podStartE2EDuration="2.333757284s" podCreationTimestamp="2025-10-02 11:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:21.330324085 +0000 UTC m=+1257.890231686" watchObservedRunningTime="2025-10-02 11:16:21.333757284 +0000 UTC m=+1257.893664865" Oct 02 11:16:22 crc kubenswrapper[4835]: I1002 11:16:22.265795 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" path="/var/lib/kubelet/pods/9c187566-73e0-40aa-a7db-a0cde5431b83/volumes" Oct 02 11:16:23 crc kubenswrapper[4835]: I1002 11:16:23.326578 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerStarted","Data":"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37"} Oct 02 11:16:23 crc kubenswrapper[4835]: I1002 11:16:23.326838 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:16:23 crc kubenswrapper[4835]: I1002 11:16:23.326866 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="proxy-httpd" containerID="cri-o://10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" gracePeriod=30 Oct 02 11:16:23 crc kubenswrapper[4835]: I1002 11:16:23.326887 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="sg-core" containerID="cri-o://86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" gracePeriod=30 Oct 02 11:16:23 crc kubenswrapper[4835]: I1002 11:16:23.327073 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-notification-agent" 
containerID="cri-o://91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" gracePeriod=30 Oct 02 11:16:23 crc kubenswrapper[4835]: I1002 11:16:23.327210 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-central-agent" containerID="cri-o://a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" gracePeriod=30 Oct 02 11:16:23 crc kubenswrapper[4835]: I1002 11:16:23.358872 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.331603028 podStartE2EDuration="10.35885301s" podCreationTimestamp="2025-10-02 11:16:13 +0000 UTC" firstStartedPulling="2025-10-02 11:16:14.072461134 +0000 UTC m=+1250.632368715" lastFinishedPulling="2025-10-02 11:16:22.099711126 +0000 UTC m=+1258.659618697" observedRunningTime="2025-10-02 11:16:23.355652397 +0000 UTC m=+1259.915559978" watchObservedRunningTime="2025-10-02 11:16:23.35885301 +0000 UTC m=+1259.918760591" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.212754 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339600 4835 generic.go:334] "Generic (PLEG): container finished" podID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerID="10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" exitCode=0 Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339637 4835 generic.go:334] "Generic (PLEG): container finished" podID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerID="86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" exitCode=2 Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339645 4835 generic.go:334] "Generic (PLEG): container finished" podID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerID="91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" exitCode=0 Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339653 4835 generic.go:334] "Generic (PLEG): container finished" podID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerID="a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" exitCode=0 Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339674 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerDied","Data":"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37"} Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339703 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerDied","Data":"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0"} Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339715 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerDied","Data":"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1"} Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339724 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerDied","Data":"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6"} Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339732 4835 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"be7ab938-6666-4687-aa3b-61b4da0d7ac6","Type":"ContainerDied","Data":"d967b76a8d1885de12089108f1c9b240207412de03f9482876947cf86f5ddd2d"} Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339752 4835 scope.go:117] "RemoveContainer" containerID="10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.339764 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356129 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-config-data\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356176 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-ceilometer-tls-certs\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356282 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-log-httpd\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356344 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-run-httpd\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356393 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-scripts\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356418 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-combined-ca-bundle\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356452 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmt8j\" (UniqueName: \"kubernetes.io/projected/be7ab938-6666-4687-aa3b-61b4da0d7ac6-kube-api-access-fmt8j\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356495 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-sg-core-conf-yaml\") pod \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\" (UID: \"be7ab938-6666-4687-aa3b-61b4da0d7ac6\") " Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.356960 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.357507 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.357525 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.363274 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be7ab938-6666-4687-aa3b-61b4da0d7ac6-kube-api-access-fmt8j" (OuterVolumeSpecName: "kube-api-access-fmt8j") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "kube-api-access-fmt8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.364876 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-scripts" (OuterVolumeSpecName: "scripts") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.369874 4835 scope.go:117] "RemoveContainer" containerID="86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.388411 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.391975 4835 scope.go:117] "RemoveContainer" containerID="91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.412721 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.427880 4835 scope.go:117] "RemoveContainer" containerID="a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.431416 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.461033 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be7ab938-6666-4687-aa3b-61b4da0d7ac6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.461082 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.461091 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.461107 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmt8j\" (UniqueName: \"kubernetes.io/projected/be7ab938-6666-4687-aa3b-61b4da0d7ac6-kube-api-access-fmt8j\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.461117 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.461125 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.473349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-config-data" (OuterVolumeSpecName: "config-data") pod "be7ab938-6666-4687-aa3b-61b4da0d7ac6" (UID: "be7ab938-6666-4687-aa3b-61b4da0d7ac6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.564197 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7ab938-6666-4687-aa3b-61b4da0d7ac6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.572964 4835 scope.go:117] "RemoveContainer" containerID="10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.573640 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": container with ID starting with 10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37 not found: ID does not exist" containerID="10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.573674 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37"} err="failed to get container status \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": rpc error: code = NotFound desc = could not find container \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": container with ID starting with 10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.573700 4835 scope.go:117] "RemoveContainer" containerID="86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.574621 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": container with ID starting with 86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0 not found: ID does not exist" containerID="86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.574692 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0"} err="failed to get container status \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": rpc error: code = NotFound desc = could not find container \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": container with ID starting with 86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.574732 4835 scope.go:117] "RemoveContainer" containerID="91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.575332 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": container with ID starting with 91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1 not found: ID does not exist" containerID="91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.575363 4835 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1"} err="failed to get container status \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": rpc error: code = NotFound desc = could not find container \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": container with ID starting with 91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.575379 4835 scope.go:117] "RemoveContainer" containerID="a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.575860 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": container with ID starting with a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6 not found: ID does not exist" containerID="a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.575881 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6"} err="failed to get container status \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": rpc error: code = NotFound desc = could not find container \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": container with ID starting with a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.575896 4835 scope.go:117] "RemoveContainer" containerID="10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.576300 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37"} err="failed to get container status \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": rpc error: code = NotFound desc = could not find container \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": container with ID starting with 10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.576351 4835 scope.go:117] "RemoveContainer" containerID="86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.576781 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0"} err="failed to get container status \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": rpc error: code = NotFound desc = could not find container \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": container with ID starting with 86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.576805 4835 scope.go:117] "RemoveContainer" containerID="91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.577109 4835 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1"} err="failed to get container status \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": rpc error: code = NotFound desc = could not find container \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": container with ID starting with 91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.577133 4835 scope.go:117] "RemoveContainer" containerID="a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.577575 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6"} err="failed to get container status \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": rpc error: code = NotFound desc = could not find container \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": container with ID starting with a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.577622 4835 scope.go:117] "RemoveContainer" containerID="10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.577996 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37"} err="failed to get container status \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": rpc error: code = NotFound desc = could not find container \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": container with ID starting with 10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.578027 4835 scope.go:117] "RemoveContainer" containerID="86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.578502 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0"} err="failed to get container status \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": rpc error: code = NotFound desc = could not find container \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": container with ID starting with 86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.578541 4835 scope.go:117] "RemoveContainer" containerID="91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.578853 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1"} err="failed to get container status \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": rpc error: code = NotFound desc = could not find container \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": container with ID starting with 91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1 not found: ID does not 
exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.578897 4835 scope.go:117] "RemoveContainer" containerID="a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.579250 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6"} err="failed to get container status \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": rpc error: code = NotFound desc = could not find container \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": container with ID starting with a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.579273 4835 scope.go:117] "RemoveContainer" containerID="10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.579683 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37"} err="failed to get container status \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": rpc error: code = NotFound desc = could not find container \"10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37\": container with ID starting with 10d03fe848349db0f3915235e20c0ce3ca24561dda0f434924af40b0b7e97f37 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.579725 4835 scope.go:117] "RemoveContainer" containerID="86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.580098 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0"} err="failed to get container status \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": rpc error: code = NotFound desc = could not find container \"86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0\": container with ID starting with 86625e87b38e26187f637371deca64d21663ca0ea49a6d5dc98790605d7d8ae0 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.580125 4835 scope.go:117] "RemoveContainer" containerID="91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.580421 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1"} err="failed to get container status \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": rpc error: code = NotFound desc = could not find container \"91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1\": container with ID starting with 91925617d3babe2e5c5a39592c3c736b875e9dfeb054976cfd19492349303de1 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.580447 4835 scope.go:117] "RemoveContainer" containerID="a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.580733 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6"} err="failed to get container status 
\"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": rpc error: code = NotFound desc = could not find container \"a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6\": container with ID starting with a98b8331ea237da01a1449f7e83419840e5a060cb1be6a2484c7e9c4563faca6 not found: ID does not exist" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.676713 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.684113 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.704418 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.704842 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="sg-core" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.704861 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="sg-core" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.704880 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="proxy-httpd" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.704887 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="proxy-httpd" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.704904 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-notification-agent" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.704911 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-notification-agent" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.704929 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerName="dnsmasq-dns" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.704934 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerName="dnsmasq-dns" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.704945 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerName="init" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.704952 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerName="init" Oct 02 11:16:24 crc kubenswrapper[4835]: E1002 11:16:24.704961 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-central-agent" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.704968 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-central-agent" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.705125 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="proxy-httpd" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.705135 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c187566-73e0-40aa-a7db-a0cde5431b83" containerName="dnsmasq-dns" Oct 02 11:16:24 
crc kubenswrapper[4835]: I1002 11:16:24.705148 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="sg-core" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.705164 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-notification-agent" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.705174 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" containerName="ceilometer-central-agent" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.706713 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.712444 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.713010 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.713198 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.722358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-config-data\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871287 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871313 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-run-httpd\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871363 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-scripts\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871379 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871402 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-log-httpd\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.871450 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl9xt\" (UniqueName: \"kubernetes.io/projected/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-kube-api-access-wl9xt\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972523 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972578 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-run-httpd\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972619 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-scripts\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972639 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-log-httpd\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972728 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl9xt\" (UniqueName: \"kubernetes.io/projected/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-kube-api-access-wl9xt\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972786 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.972828 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-config-data\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.973597 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-log-httpd\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.974382 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-run-httpd\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.978175 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.978800 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-scripts\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.980303 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-config-data\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.981412 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.989806 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:24 crc kubenswrapper[4835]: I1002 11:16:24.995885 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl9xt\" (UniqueName: \"kubernetes.io/projected/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-kube-api-access-wl9xt\") pod \"ceilometer-0\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " pod="openstack/ceilometer-0" Oct 02 11:16:25 crc kubenswrapper[4835]: I1002 11:16:25.025586 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:25 crc kubenswrapper[4835]: I1002 11:16:25.449065 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:26 crc kubenswrapper[4835]: I1002 11:16:26.263349 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be7ab938-6666-4687-aa3b-61b4da0d7ac6" path="/var/lib/kubelet/pods/be7ab938-6666-4687-aa3b-61b4da0d7ac6/volumes" Oct 02 11:16:26 crc kubenswrapper[4835]: I1002 11:16:26.363617 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerStarted","Data":"c58947c06444e2c9cd97e8b9e6d35440bbf37c8f7a93ea5ca234626050b1ec43"} Oct 02 11:16:27 crc kubenswrapper[4835]: I1002 11:16:27.382790 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerStarted","Data":"9fa83c37b66b77e9f1398e86db5e86144f94887c3677b6c7d0583308d5984532"} Oct 02 11:16:28 crc kubenswrapper[4835]: I1002 11:16:28.392938 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerStarted","Data":"40036333de08c545264bfb1253e45d29b7e4f4a886ea51531c73a8f82ce553c9"} Oct 02 11:16:29 crc kubenswrapper[4835]: I1002 11:16:29.403282 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerStarted","Data":"e0153bb81430f9f4250e5489e92b337cb89b0154ef70f82bc0f30fe4b26a7944"} Oct 02 11:16:29 crc kubenswrapper[4835]: I1002 11:16:29.698896 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.309700 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fcjfv"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.312544 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.317907 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.318119 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.331824 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fcjfv"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.487870 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.488544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-config-data\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.488715 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-scripts\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.488749 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8f7m\" (UniqueName: \"kubernetes.io/projected/28eb8749-6de4-4c76-928c-0f35ce4c378a-kube-api-access-z8f7m\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.529723 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.531488 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.536474 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.567424 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.570593 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.571841 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.586832 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.590771 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-config-data\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.590938 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-scripts\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.590967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8f7m\" (UniqueName: \"kubernetes.io/projected/28eb8749-6de4-4c76-928c-0f35ce4c378a-kube-api-access-z8f7m\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.590993 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.597688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-scripts\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.620094 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-config-data\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.624833 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.628946 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.631979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.640715 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.649904 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8f7m\" (UniqueName: \"kubernetes.io/projected/28eb8749-6de4-4c76-928c-0f35ce4c378a-kube-api-access-z8f7m\") pod \"nova-cell0-cell-mapping-fcjfv\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.693500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqrqf\" (UniqueName: \"kubernetes.io/projected/a4aa3031-48f9-4215-92cc-d321275d6875-kube-api-access-kqrqf\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.693642 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.693705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4aa3031-48f9-4215-92cc-d321275d6875-logs\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.693780 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhlr\" (UniqueName: \"kubernetes.io/projected/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-kube-api-access-ljhlr\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.693829 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-config-data\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.693863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.693903 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.717319 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.721850 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.738714 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.771111 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.773102 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.776209 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.780829 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795257 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795321 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljhlr\" (UniqueName: \"kubernetes.io/projected/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-kube-api-access-ljhlr\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795353 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sfmq\" (UniqueName: \"kubernetes.io/projected/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-kube-api-access-9sfmq\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795394 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-config-data\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795433 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795492 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqrqf\" (UniqueName: \"kubernetes.io/projected/a4aa3031-48f9-4215-92cc-d321275d6875-kube-api-access-kqrqf\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795517 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-config-data\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795537 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.795625 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4aa3031-48f9-4215-92cc-d321275d6875-logs\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.796050 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4aa3031-48f9-4215-92cc-d321275d6875-logs\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.807289 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-config-data\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.811529 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.813802 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.818610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.838886 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhlr\" (UniqueName: \"kubernetes.io/projected/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-kube-api-access-ljhlr\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.839745 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqrqf\" (UniqueName: \"kubernetes.io/projected/a4aa3031-48f9-4215-92cc-d321275d6875-kube-api-access-kqrqf\") pod \"nova-api-0\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.891360 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.899856 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-config-data\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.899968 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-495nr\" (UniqueName: \"kubernetes.io/projected/653132e3-7688-428a-8fcf-1b241941cb39-kube-api-access-495nr\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.900036 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.900083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sfmq\" (UniqueName: \"kubernetes.io/projected/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-kube-api-access-9sfmq\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.900118 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653132e3-7688-428a-8fcf-1b241941cb39-logs\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.900138 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.900196 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-config-data\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.904846 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-78gl4"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.908363 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.911996 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-config-data\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.920648 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-78gl4"] Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.945683 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sfmq\" (UniqueName: \"kubernetes.io/projected/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-kube-api-access-9sfmq\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:30 crc kubenswrapper[4835]: I1002 11:16:30.947999 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.002775 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-dns-svc\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003068 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653132e3-7688-428a-8fcf-1b241941cb39-logs\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003156 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-config-data\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003488 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-config\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003632 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4t2\" (UniqueName: \"kubernetes.io/projected/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-kube-api-access-xk4t2\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " 
pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003730 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003845 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-495nr\" (UniqueName: \"kubernetes.io/projected/653132e3-7688-428a-8fcf-1b241941cb39-kube-api-access-495nr\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.003964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.004509 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653132e3-7688-428a-8fcf-1b241941cb39-logs\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.013336 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.017547 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-config-data\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.030917 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-495nr\" (UniqueName: \"kubernetes.io/projected/653132e3-7688-428a-8fcf-1b241941cb39-kube-api-access-495nr\") pod \"nova-metadata-0\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.085095 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.105589 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-dns-svc\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.105702 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-config\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.105758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4t2\" (UniqueName: \"kubernetes.io/projected/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-kube-api-access-xk4t2\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.105790 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.105844 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.107167 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-config\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.107666 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-dns-svc\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.108250 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.110300 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.127770 
4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.132906 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4t2\" (UniqueName: \"kubernetes.io/projected/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-kube-api-access-xk4t2\") pod \"dnsmasq-dns-566b5b7845-78gl4\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.141314 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.298981 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.366058 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fcjfv"] Oct 02 11:16:31 crc kubenswrapper[4835]: W1002 11:16:31.417950 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28eb8749_6de4_4c76_928c_0f35ce4c378a.slice/crio-7773e7ba690ce2eaaa3c593309d593bef6460d6c2b46a5527c25adc3e7a998c8 WatchSource:0}: Error finding container 7773e7ba690ce2eaaa3c593309d593bef6460d6c2b46a5527c25adc3e7a998c8: Status 404 returned error can't find the container with id 7773e7ba690ce2eaaa3c593309d593bef6460d6c2b46a5527c25adc3e7a998c8 Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.507214 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerStarted","Data":"1549b45c0c3b39852da347f79e1774916de282b3947c9ecdec990d75563ceda7"} Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.508837 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.521361 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fcjfv" event={"ID":"28eb8749-6de4-4c76-928c-0f35ce4c378a","Type":"ContainerStarted","Data":"7773e7ba690ce2eaaa3c593309d593bef6460d6c2b46a5527c25adc3e7a998c8"} Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.549030 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.818886547 podStartE2EDuration="7.549007437s" podCreationTimestamp="2025-10-02 11:16:24 +0000 UTC" firstStartedPulling="2025-10-02 11:16:25.465641885 +0000 UTC m=+1262.025549466" lastFinishedPulling="2025-10-02 11:16:30.195762775 +0000 UTC m=+1266.755670356" observedRunningTime="2025-10-02 11:16:31.538594366 +0000 UTC m=+1268.098501967" watchObservedRunningTime="2025-10-02 11:16:31.549007437 +0000 UTC m=+1268.108915018" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.645600 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.691905 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mzxn"] Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.693945 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.697799 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.699401 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mzxn"] Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.701101 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.775823 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-config-data\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.775896 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bptbf\" (UniqueName: \"kubernetes.io/projected/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-kube-api-access-bptbf\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.775974 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.776042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-scripts\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.785944 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.887570 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bptbf\" (UniqueName: \"kubernetes.io/projected/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-kube-api-access-bptbf\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.888091 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.888206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-scripts\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: 
\"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.888319 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-config-data\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.894098 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-config-data\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.894601 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-scripts\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.897147 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: I1002 11:16:31.916718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptbf\" (UniqueName: \"kubernetes.io/projected/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-kube-api-access-bptbf\") pod \"nova-cell1-conductor-db-sync-7mzxn\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:31 crc kubenswrapper[4835]: W1002 11:16:31.993529 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod653132e3_7688_428a_8fcf_1b241941cb39.slice/crio-c55b75940b83777c1a067dbaa7fc4b0cc3d9b3999ad1c7dfa789ef9fd709ed87 WatchSource:0}: Error finding container c55b75940b83777c1a067dbaa7fc4b0cc3d9b3999ad1c7dfa789ef9fd709ed87: Status 404 returned error can't find the container with id c55b75940b83777c1a067dbaa7fc4b0cc3d9b3999ad1c7dfa789ef9fd709ed87 Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.002484 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:32 crc kubenswrapper[4835]: W1002 11:16:32.013823 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe66b9c7_49c0_4cdc_94d4_bd9c275c3d59.slice/crio-250c6de4432b288ee3cf61f77e2f08d09ec944a2301d7ab1e31bcaaba7ee5a1f WatchSource:0}: Error finding container 250c6de4432b288ee3cf61f77e2f08d09ec944a2301d7ab1e31bcaaba7ee5a1f: Status 404 returned error can't find the container with id 250c6de4432b288ee3cf61f77e2f08d09ec944a2301d7ab1e31bcaaba7ee5a1f Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.020236 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.041150 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.134940 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-78gl4"] Oct 02 11:16:32 crc kubenswrapper[4835]: W1002 11:16:32.141245 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18af06c7_589a_4ddb_aa4f_92ddfb5ed95d.slice/crio-1581cd2366cf30f69c6d0205918d39498e243376a66fc79aefe42610ff40dd09 WatchSource:0}: Error finding container 1581cd2366cf30f69c6d0205918d39498e243376a66fc79aefe42610ff40dd09: Status 404 returned error can't find the container with id 1581cd2366cf30f69c6d0205918d39498e243376a66fc79aefe42610ff40dd09 Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.534497 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39","Type":"ContainerStarted","Data":"e632f98883c831193e214c7919cad4b534e99b9fe2b9e6e901473905333d1966"} Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.538522 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4aa3031-48f9-4215-92cc-d321275d6875","Type":"ContainerStarted","Data":"49b205c4023cae6916588be49d95e9dcde4d9cea16f120b9ba784129c23f075b"} Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.541081 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" event={"ID":"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d","Type":"ContainerStarted","Data":"8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a"} Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.541145 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" event={"ID":"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d","Type":"ContainerStarted","Data":"1581cd2366cf30f69c6d0205918d39498e243376a66fc79aefe42610ff40dd09"} Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.546178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59","Type":"ContainerStarted","Data":"250c6de4432b288ee3cf61f77e2f08d09ec944a2301d7ab1e31bcaaba7ee5a1f"} Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.550398 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fcjfv" event={"ID":"28eb8749-6de4-4c76-928c-0f35ce4c378a","Type":"ContainerStarted","Data":"f854c0448719ab5b98713d841698bc7c1e7a9721059c23de721d27109c1336d1"} Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.554684 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"653132e3-7688-428a-8fcf-1b241941cb39","Type":"ContainerStarted","Data":"c55b75940b83777c1a067dbaa7fc4b0cc3d9b3999ad1c7dfa789ef9fd709ed87"} Oct 02 11:16:32 crc kubenswrapper[4835]: I1002 11:16:32.591556 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fcjfv" podStartSLOduration=2.591535876 podStartE2EDuration="2.591535876s" podCreationTimestamp="2025-10-02 11:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:32.58718754 +0000 UTC m=+1269.147095131" watchObservedRunningTime="2025-10-02 11:16:32.591535876 +0000 UTC m=+1269.151443457" Oct 02 11:16:32 crc kubenswrapper[4835]: 
I1002 11:16:32.636993 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mzxn"] Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.030268 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.030537 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7e339cea-b064-4e8b-a147-af1337175248" containerName="nova-cell0-conductor-conductor" containerID="cri-o://29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb" gracePeriod=30 Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.062294 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.070834 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.084821 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.094074 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.569713 4835 generic.go:334] "Generic (PLEG): container finished" podID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerID="8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a" exitCode=0 Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.569803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" event={"ID":"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d","Type":"ContainerDied","Data":"8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a"} Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.582076 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" event={"ID":"3f1cca3d-b017-4cc3-875e-41a75f8ee14a","Type":"ContainerStarted","Data":"bf73a1aefa7dcb2c77acb62ad908faee1449eaf34b1a8537c96d8a50b27201fe"} Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.582134 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" event={"ID":"3f1cca3d-b017-4cc3-875e-41a75f8ee14a","Type":"ContainerStarted","Data":"376a634b114981f4566695b731fe3a8379b69d7211c81c7a53e8344cda386baf"} Oct 02 11:16:33 crc kubenswrapper[4835]: I1002 11:16:33.633257 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" podStartSLOduration=2.63321212 podStartE2EDuration="2.63321212s" podCreationTimestamp="2025-10-02 11:16:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:33.622260014 +0000 UTC m=+1270.182167615" watchObservedRunningTime="2025-10-02 11:16:33.63321212 +0000 UTC m=+1270.193119701" Oct 02 11:16:34 crc kubenswrapper[4835]: I1002 11:16:34.164918 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:34 crc kubenswrapper[4835]: I1002 11:16:34.592858 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-central-agent" containerID="cri-o://9fa83c37b66b77e9f1398e86db5e86144f94887c3677b6c7d0583308d5984532" gracePeriod=30 
Oct 02 11:16:34 crc kubenswrapper[4835]: I1002 11:16:34.593161 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="proxy-httpd" containerID="cri-o://1549b45c0c3b39852da347f79e1774916de282b3947c9ecdec990d75563ceda7" gracePeriod=30 Oct 02 11:16:34 crc kubenswrapper[4835]: I1002 11:16:34.593297 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="sg-core" containerID="cri-o://e0153bb81430f9f4250e5489e92b337cb89b0154ef70f82bc0f30fe4b26a7944" gracePeriod=30 Oct 02 11:16:34 crc kubenswrapper[4835]: I1002 11:16:34.593340 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-notification-agent" containerID="cri-o://40036333de08c545264bfb1253e45d29b7e4f4a886ea51531c73a8f82ce553c9" gracePeriod=30 Oct 02 11:16:34 crc kubenswrapper[4835]: E1002 11:16:34.662836 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:16:34 crc kubenswrapper[4835]: E1002 11:16:34.664784 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:16:34 crc kubenswrapper[4835]: E1002 11:16:34.666189 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:16:34 crc kubenswrapper[4835]: E1002 11:16:34.666245 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7e339cea-b064-4e8b-a147-af1337175248" containerName="nova-cell0-conductor-conductor" Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609489 4835 generic.go:334] "Generic (PLEG): container finished" podID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerID="1549b45c0c3b39852da347f79e1774916de282b3947c9ecdec990d75563ceda7" exitCode=0 Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609751 4835 generic.go:334] "Generic (PLEG): container finished" podID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerID="e0153bb81430f9f4250e5489e92b337cb89b0154ef70f82bc0f30fe4b26a7944" exitCode=2 Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609762 4835 generic.go:334] "Generic (PLEG): container finished" podID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerID="40036333de08c545264bfb1253e45d29b7e4f4a886ea51531c73a8f82ce553c9" exitCode=0 Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609771 4835 generic.go:334] "Generic (PLEG): container finished" podID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" 
containerID="9fa83c37b66b77e9f1398e86db5e86144f94887c3677b6c7d0583308d5984532" exitCode=0 Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609572 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerDied","Data":"1549b45c0c3b39852da347f79e1774916de282b3947c9ecdec990d75563ceda7"} Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609807 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerDied","Data":"e0153bb81430f9f4250e5489e92b337cb89b0154ef70f82bc0f30fe4b26a7944"} Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerDied","Data":"40036333de08c545264bfb1253e45d29b7e4f4a886ea51531c73a8f82ce553c9"} Oct 02 11:16:35 crc kubenswrapper[4835]: I1002 11:16:35.609830 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerDied","Data":"9fa83c37b66b77e9f1398e86db5e86144f94887c3677b6c7d0583308d5984532"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.391781 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491034 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-combined-ca-bundle\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491178 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-run-httpd\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491240 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl9xt\" (UniqueName: \"kubernetes.io/projected/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-kube-api-access-wl9xt\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491285 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-log-httpd\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491369 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-scripts\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491407 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-ceilometer-tls-certs\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: 
\"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491441 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-config-data\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.491469 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-sg-core-conf-yaml\") pod \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\" (UID: \"7ed946ae-0d44-4086-92bd-f9e1892c2ec6\") " Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.492842 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.492888 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.505516 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-kube-api-access-wl9xt" (OuterVolumeSpecName: "kube-api-access-wl9xt") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "kube-api-access-wl9xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.507650 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-scripts" (OuterVolumeSpecName: "scripts") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.580136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.593877 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.593921 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl9xt\" (UniqueName: \"kubernetes.io/projected/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-kube-api-access-wl9xt\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.593937 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.593949 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.593960 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.618569 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.624526 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.632527 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"653132e3-7688-428a-8fcf-1b241941cb39","Type":"ContainerStarted","Data":"2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.632571 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"653132e3-7688-428a-8fcf-1b241941cb39","Type":"ContainerStarted","Data":"0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.632663 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-metadata" containerID="cri-o://2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad" gracePeriod=30 Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.632658 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-log" containerID="cri-o://0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7" gracePeriod=30 Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.640907 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39","Type":"ContainerStarted","Data":"5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.641066 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e" gracePeriod=30 Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.651656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4aa3031-48f9-4215-92cc-d321275d6875","Type":"ContainerStarted","Data":"66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.651712 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4aa3031-48f9-4215-92cc-d321275d6875","Type":"ContainerStarted","Data":"c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.651856 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-log" containerID="cri-o://c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb" gracePeriod=30 Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.651979 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-api" containerID="cri-o://66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512" gracePeriod=30 Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.701287 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.708627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" event={"ID":"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d","Type":"ContainerStarted","Data":"48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.713243 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.715101 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59","Type":"ContainerStarted","Data":"330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.715289 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" containerName="nova-scheduler-scheduler" containerID="cri-o://330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47" gracePeriod=30 Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.716118 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.466290339 podStartE2EDuration="6.716108496s" podCreationTimestamp="2025-10-02 11:16:30 +0000 UTC" firstStartedPulling="2025-10-02 11:16:31.7977175 +0000 UTC m=+1268.357625081" lastFinishedPulling="2025-10-02 11:16:36.047535657 +0000 UTC m=+1272.607443238" observedRunningTime="2025-10-02 11:16:36.711463111 +0000 UTC m=+1273.271370692" watchObservedRunningTime="2025-10-02 11:16:36.716108496 +0000 UTC m=+1273.276016077" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.733407 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.733695 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed946ae-0d44-4086-92bd-f9e1892c2ec6","Type":"ContainerDied","Data":"c58947c06444e2c9cd97e8b9e6d35440bbf37c8f7a93ea5ca234626050b1ec43"} Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.733754 4835 scope.go:117] "RemoveContainer" containerID="1549b45c0c3b39852da347f79e1774916de282b3947c9ecdec990d75563ceda7" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.733985 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.744526 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.683582895 podStartE2EDuration="6.744506236s" podCreationTimestamp="2025-10-02 11:16:30 +0000 UTC" firstStartedPulling="2025-10-02 11:16:31.998640843 +0000 UTC m=+1268.558548434" lastFinishedPulling="2025-10-02 11:16:36.059564194 +0000 UTC m=+1272.619471775" observedRunningTime="2025-10-02 11:16:36.742811237 +0000 UTC m=+1273.302718838" watchObservedRunningTime="2025-10-02 11:16:36.744506236 +0000 UTC m=+1273.304413827" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.769666 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.386496685 podStartE2EDuration="6.769649782s" podCreationTimestamp="2025-10-02 11:16:30 +0000 UTC" firstStartedPulling="2025-10-02 11:16:31.675108029 +0000 UTC m=+1268.235015610" lastFinishedPulling="2025-10-02 11:16:36.058261126 +0000 UTC m=+1272.618168707" observedRunningTime="2025-10-02 11:16:36.767532871 +0000 UTC m=+1273.327440452" watchObservedRunningTime="2025-10-02 11:16:36.769649782 +0000 UTC m=+1273.329557363" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.824700 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-config-data" (OuterVolumeSpecName: "config-data") pod "7ed946ae-0d44-4086-92bd-f9e1892c2ec6" (UID: "7ed946ae-0d44-4086-92bd-f9e1892c2ec6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.833752 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.795686742 podStartE2EDuration="6.833723472s" podCreationTimestamp="2025-10-02 11:16:30 +0000 UTC" firstStartedPulling="2025-10-02 11:16:32.020400881 +0000 UTC m=+1268.580308462" lastFinishedPulling="2025-10-02 11:16:36.058437611 +0000 UTC m=+1272.618345192" observedRunningTime="2025-10-02 11:16:36.82949347 +0000 UTC m=+1273.389401051" watchObservedRunningTime="2025-10-02 11:16:36.833723472 +0000 UTC m=+1273.393631053" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.846845 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.847941 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed946ae-0d44-4086-92bd-f9e1892c2ec6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.876623 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" podStartSLOduration=6.876350563 podStartE2EDuration="6.876350563s" podCreationTimestamp="2025-10-02 11:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:36.871704989 +0000 UTC m=+1273.431612580" watchObservedRunningTime="2025-10-02 11:16:36.876350563 +0000 UTC m=+1273.436258144" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.938185 4835 scope.go:117] "RemoveContainer" 
containerID="e0153bb81430f9f4250e5489e92b337cb89b0154ef70f82bc0f30fe4b26a7944" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.962615 4835 scope.go:117] "RemoveContainer" containerID="40036333de08c545264bfb1253e45d29b7e4f4a886ea51531c73a8f82ce553c9" Oct 02 11:16:36 crc kubenswrapper[4835]: I1002 11:16:36.990577 4835 scope.go:117] "RemoveContainer" containerID="9fa83c37b66b77e9f1398e86db5e86144f94887c3677b6c7d0583308d5984532" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.075353 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.090354 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.101304 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:37 crc kubenswrapper[4835]: E1002 11:16:37.101862 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-central-agent" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.101887 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-central-agent" Oct 02 11:16:37 crc kubenswrapper[4835]: E1002 11:16:37.101919 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-notification-agent" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.101931 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-notification-agent" Oct 02 11:16:37 crc kubenswrapper[4835]: E1002 11:16:37.101964 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="sg-core" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.101971 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="sg-core" Oct 02 11:16:37 crc kubenswrapper[4835]: E1002 11:16:37.101989 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="proxy-httpd" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.101995 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="proxy-httpd" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.102197 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-notification-agent" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.102244 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="ceilometer-central-agent" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.102271 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="proxy-httpd" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.102281 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" containerName="sg-core" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.107829 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.113187 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.113417 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.113519 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.114081 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.168701 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-run-httpd\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.168773 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-scripts\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.168875 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-log-httpd\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.168913 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-config-data\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.169137 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.169186 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.169274 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.169403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf9n7\" (UniqueName: 
\"kubernetes.io/projected/c55deccf-4e69-437c-96a0-ff5f8200acad-kube-api-access-zf9n7\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.270874 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf9n7\" (UniqueName: \"kubernetes.io/projected/c55deccf-4e69-437c-96a0-ff5f8200acad-kube-api-access-zf9n7\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.270955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-run-httpd\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.270978 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-scripts\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.271029 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-log-httpd\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.271047 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-config-data\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.271093 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.271114 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.271140 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.271492 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-run-httpd\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.271510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-log-httpd\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.275271 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.276894 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.277038 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-scripts\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.277390 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.277405 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-config-data\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.293959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf9n7\" (UniqueName: \"kubernetes.io/projected/c55deccf-4e69-437c-96a0-ff5f8200acad-kube-api-access-zf9n7\") pod \"ceilometer-0\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.453256 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.752127 4835 generic.go:334] "Generic (PLEG): container finished" podID="7e339cea-b064-4e8b-a147-af1337175248" containerID="29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb" exitCode=0 Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.752232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7e339cea-b064-4e8b-a147-af1337175248","Type":"ContainerDied","Data":"29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb"} Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.758359 4835 generic.go:334] "Generic (PLEG): container finished" podID="a4aa3031-48f9-4215-92cc-d321275d6875" containerID="c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb" exitCode=143 Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.758452 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4aa3031-48f9-4215-92cc-d321275d6875","Type":"ContainerDied","Data":"c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb"} Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.764381 4835 generic.go:334] "Generic (PLEG): container finished" podID="653132e3-7688-428a-8fcf-1b241941cb39" containerID="0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7" exitCode=143 Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.765089 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"653132e3-7688-428a-8fcf-1b241941cb39","Type":"ContainerDied","Data":"0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7"} Oct 02 11:16:37 crc kubenswrapper[4835]: I1002 11:16:37.971328 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.137445 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.195335 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-combined-ca-bundle\") pod \"7e339cea-b064-4e8b-a147-af1337175248\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.195431 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdsw2\" (UniqueName: \"kubernetes.io/projected/7e339cea-b064-4e8b-a147-af1337175248-kube-api-access-zdsw2\") pod \"7e339cea-b064-4e8b-a147-af1337175248\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.195611 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-config-data\") pod \"7e339cea-b064-4e8b-a147-af1337175248\" (UID: \"7e339cea-b064-4e8b-a147-af1337175248\") " Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.207484 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e339cea-b064-4e8b-a147-af1337175248-kube-api-access-zdsw2" (OuterVolumeSpecName: "kube-api-access-zdsw2") pod "7e339cea-b064-4e8b-a147-af1337175248" (UID: "7e339cea-b064-4e8b-a147-af1337175248"). InnerVolumeSpecName "kube-api-access-zdsw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.233079 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e339cea-b064-4e8b-a147-af1337175248" (UID: "7e339cea-b064-4e8b-a147-af1337175248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.238278 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-config-data" (OuterVolumeSpecName: "config-data") pod "7e339cea-b064-4e8b-a147-af1337175248" (UID: "7e339cea-b064-4e8b-a147-af1337175248"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.266079 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed946ae-0d44-4086-92bd-f9e1892c2ec6" path="/var/lib/kubelet/pods/7ed946ae-0d44-4086-92bd-f9e1892c2ec6/volumes" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.298126 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.298161 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e339cea-b064-4e8b-a147-af1337175248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.298171 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdsw2\" (UniqueName: \"kubernetes.io/projected/7e339cea-b064-4e8b-a147-af1337175248-kube-api-access-zdsw2\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.788160 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerStarted","Data":"1ca52b1872c2b55bbbbfcf731c92d3abdcc2ad21260da534db7ec416f7535975"} Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.790882 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7e339cea-b064-4e8b-a147-af1337175248","Type":"ContainerDied","Data":"020874ee770c1128a973b7b4f7c05b37e18547b6c50a0393722d3e14ae6754d8"} Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.790940 4835 scope.go:117] "RemoveContainer" containerID="29118060e1546bf778268e7a613b73fc51dcaed84f147ec712487e9ba2b25dfb" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.790958 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.963997 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.971657 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.988987 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:38 crc kubenswrapper[4835]: E1002 11:16:38.989386 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e339cea-b064-4e8b-a147-af1337175248" containerName="nova-cell0-conductor-conductor" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.989404 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e339cea-b064-4e8b-a147-af1337175248" containerName="nova-cell0-conductor-conductor" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.989578 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e339cea-b064-4e8b-a147-af1337175248" containerName="nova-cell0-conductor-conductor" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.990241 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:38 crc kubenswrapper[4835]: I1002 11:16:38.993696 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.007409 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.114076 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17d51a8-a2be-44db-ae3f-92a2110b34be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.114176 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17d51a8-a2be-44db-ae3f-92a2110b34be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.114352 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f86md\" (UniqueName: \"kubernetes.io/projected/f17d51a8-a2be-44db-ae3f-92a2110b34be-kube-api-access-f86md\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.211153 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d568cc985-z84bp" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.216976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17d51a8-a2be-44db-ae3f-92a2110b34be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.217738 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17d51a8-a2be-44db-ae3f-92a2110b34be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.217845 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f86md\" (UniqueName: \"kubernetes.io/projected/f17d51a8-a2be-44db-ae3f-92a2110b34be-kube-api-access-f86md\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.222629 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17d51a8-a2be-44db-ae3f-92a2110b34be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.222934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17d51a8-a2be-44db-ae3f-92a2110b34be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.252763 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f86md\" (UniqueName: \"kubernetes.io/projected/f17d51a8-a2be-44db-ae3f-92a2110b34be-kube-api-access-f86md\") pod \"nova-cell0-conductor-0\" (UID: \"f17d51a8-a2be-44db-ae3f-92a2110b34be\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.277691 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76cd984794-hg967"] Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.278086 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76cd984794-hg967" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-api" containerID="cri-o://6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582" gracePeriod=30 Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.278198 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76cd984794-hg967" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-httpd" containerID="cri-o://89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a" gracePeriod=30 Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.321656 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.814204 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerStarted","Data":"15a28b4dada598d6695e48ab3b27a006a69d2c145d5661023700694f531efe80"} Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.823475 4835 generic.go:334] "Generic (PLEG): container finished" podID="e02806da-6104-48e3-8d94-db8475e53b68" containerID="89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a" exitCode=0 Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.823609 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cd984794-hg967" event={"ID":"e02806da-6104-48e3-8d94-db8475e53b68","Type":"ContainerDied","Data":"89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a"} Oct 02 11:16:39 crc kubenswrapper[4835]: I1002 11:16:39.907498 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.265595 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e339cea-b064-4e8b-a147-af1337175248" path="/var/lib/kubelet/pods/7e339cea-b064-4e8b-a147-af1337175248/volumes" Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.848501 4835 generic.go:334] "Generic (PLEG): container finished" podID="28eb8749-6de4-4c76-928c-0f35ce4c378a" containerID="f854c0448719ab5b98713d841698bc7c1e7a9721059c23de721d27109c1336d1" exitCode=0 Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.848751 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fcjfv" event={"ID":"28eb8749-6de4-4c76-928c-0f35ce4c378a","Type":"ContainerDied","Data":"f854c0448719ab5b98713d841698bc7c1e7a9721059c23de721d27109c1336d1"} Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.851501 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerStarted","Data":"b01a12e81a4c19d31d6b531b95cb02b753130af1ae872d5283d8ed399866c9d7"} Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.851544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerStarted","Data":"8ac6d400448de7d27c3d10a316dcb417bb140ccf5ad576d32452cdd647402c21"} Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.857653 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f17d51a8-a2be-44db-ae3f-92a2110b34be","Type":"ContainerStarted","Data":"1236fb6c91ef2e9ac39d546211f9a31c664801293cf237580abd30a68d93de9a"} Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.857698 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f17d51a8-a2be-44db-ae3f-92a2110b34be","Type":"ContainerStarted","Data":"cee6a17053ed1aa2a7be3b918ca45d3216152ac5b21e9d7d3a087ad758e79a4e"} Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.858519 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:40 crc kubenswrapper[4835]: I1002 11:16:40.895753 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.895731815 podStartE2EDuration="2.895731815s" 
podCreationTimestamp="2025-10-02 11:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:40.894984314 +0000 UTC m=+1277.454891915" watchObservedRunningTime="2025-10-02 11:16:40.895731815 +0000 UTC m=+1277.455639396" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.085310 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.129319 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.141953 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.142008 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.302453 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.379844 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-9tspq"] Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.380099 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerName="dnsmasq-dns" containerID="cri-o://b3754b5f33119b5d94162cdee22794bc3130c3996f67195a78f0dfc70fa52cb4" gracePeriod=10 Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.482302 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.164:5353: connect: connection refused" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.876677 4835 generic.go:334] "Generic (PLEG): container finished" podID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerID="b3754b5f33119b5d94162cdee22794bc3130c3996f67195a78f0dfc70fa52cb4" exitCode=0 Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.876789 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" event={"ID":"c0074a72-5ca4-452b-8c69-f8ad99e14a0f","Type":"ContainerDied","Data":"b3754b5f33119b5d94162cdee22794bc3130c3996f67195a78f0dfc70fa52cb4"} Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.876819 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" event={"ID":"c0074a72-5ca4-452b-8c69-f8ad99e14a0f","Type":"ContainerDied","Data":"5bb94e4c6f83eb31527fa62b4eba4d8c7a381b53ca0485935f77273ca8d192aa"} Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.876829 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bb94e4c6f83eb31527fa62b4eba4d8c7a381b53ca0485935f77273ca8d192aa" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.880854 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f1cca3d-b017-4cc3-875e-41a75f8ee14a" containerID="bf73a1aefa7dcb2c77acb62ad908faee1449eaf34b1a8537c96d8a50b27201fe" exitCode=0 Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.881662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" 
event={"ID":"3f1cca3d-b017-4cc3-875e-41a75f8ee14a","Type":"ContainerDied","Data":"bf73a1aefa7dcb2c77acb62ad908faee1449eaf34b1a8537c96d8a50b27201fe"} Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.923241 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.984369 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.984430 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.984487 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.985290 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ece753eadbbcedfb37995a0897d789fdf5c6660566a042ca4a738c29f1789c89"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.985360 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://ece753eadbbcedfb37995a0897d789fdf5c6660566a042ca4a738c29f1789c89" gracePeriod=600 Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.992854 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-nb\") pod \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.993013 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-sb\") pod \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.993090 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-config\") pod \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 11:16:41.993241 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-dns-svc\") pod \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " Oct 02 11:16:41 crc kubenswrapper[4835]: I1002 
11:16:41.993328 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcdb2\" (UniqueName: \"kubernetes.io/projected/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-kube-api-access-gcdb2\") pod \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\" (UID: \"c0074a72-5ca4-452b-8c69-f8ad99e14a0f\") " Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.019918 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-kube-api-access-gcdb2" (OuterVolumeSpecName: "kube-api-access-gcdb2") pod "c0074a72-5ca4-452b-8c69-f8ad99e14a0f" (UID: "c0074a72-5ca4-452b-8c69-f8ad99e14a0f"). InnerVolumeSpecName "kube-api-access-gcdb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.088492 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0074a72-5ca4-452b-8c69-f8ad99e14a0f" (UID: "c0074a72-5ca4-452b-8c69-f8ad99e14a0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.093718 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-config" (OuterVolumeSpecName: "config") pod "c0074a72-5ca4-452b-8c69-f8ad99e14a0f" (UID: "c0074a72-5ca4-452b-8c69-f8ad99e14a0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.096525 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcdb2\" (UniqueName: \"kubernetes.io/projected/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-kube-api-access-gcdb2\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.096597 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.096613 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.128579 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0074a72-5ca4-452b-8c69-f8ad99e14a0f" (UID: "c0074a72-5ca4-452b-8c69-f8ad99e14a0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.129689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0074a72-5ca4-452b-8c69-f8ad99e14a0f" (UID: "c0074a72-5ca4-452b-8c69-f8ad99e14a0f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.197950 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.197996 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0074a72-5ca4-452b-8c69-f8ad99e14a0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.227185 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.299107 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-combined-ca-bundle\") pod \"28eb8749-6de4-4c76-928c-0f35ce4c378a\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.299289 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8f7m\" (UniqueName: \"kubernetes.io/projected/28eb8749-6de4-4c76-928c-0f35ce4c378a-kube-api-access-z8f7m\") pod \"28eb8749-6de4-4c76-928c-0f35ce4c378a\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.299361 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-config-data\") pod \"28eb8749-6de4-4c76-928c-0f35ce4c378a\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.299392 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-scripts\") pod \"28eb8749-6de4-4c76-928c-0f35ce4c378a\" (UID: \"28eb8749-6de4-4c76-928c-0f35ce4c378a\") " Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.305590 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28eb8749-6de4-4c76-928c-0f35ce4c378a-kube-api-access-z8f7m" (OuterVolumeSpecName: "kube-api-access-z8f7m") pod "28eb8749-6de4-4c76-928c-0f35ce4c378a" (UID: "28eb8749-6de4-4c76-928c-0f35ce4c378a"). InnerVolumeSpecName "kube-api-access-z8f7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.326376 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-scripts" (OuterVolumeSpecName: "scripts") pod "28eb8749-6de4-4c76-928c-0f35ce4c378a" (UID: "28eb8749-6de4-4c76-928c-0f35ce4c378a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.353930 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-config-data" (OuterVolumeSpecName: "config-data") pod "28eb8749-6de4-4c76-928c-0f35ce4c378a" (UID: "28eb8749-6de4-4c76-928c-0f35ce4c378a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.355473 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28eb8749-6de4-4c76-928c-0f35ce4c378a" (UID: "28eb8749-6de4-4c76-928c-0f35ce4c378a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.401923 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.401965 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.401976 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28eb8749-6de4-4c76-928c-0f35ce4c378a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.401987 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8f7m\" (UniqueName: \"kubernetes.io/projected/28eb8749-6de4-4c76-928c-0f35ce4c378a-kube-api-access-z8f7m\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.893865 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="ece753eadbbcedfb37995a0897d789fdf5c6660566a042ca4a738c29f1789c89" exitCode=0 Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.895066 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"ece753eadbbcedfb37995a0897d789fdf5c6660566a042ca4a738c29f1789c89"} Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.895175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"74bda1206c6cef4b94a808d597fd0b18bc43e5697e9459d0f58f3237db138b7b"} Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.895275 4835 scope.go:117] "RemoveContainer" containerID="2d0ce126b4147f93bbd26e8b66aa4ae542cae2002f8fd305bd26ccf1276aba52" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.903880 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fcjfv" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.903868 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fcjfv" event={"ID":"28eb8749-6de4-4c76-928c-0f35ce4c378a","Type":"ContainerDied","Data":"7773e7ba690ce2eaaa3c593309d593bef6460d6c2b46a5527c25adc3e7a998c8"} Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.904002 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7773e7ba690ce2eaaa3c593309d593bef6460d6c2b46a5527c25adc3e7a998c8" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.906767 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerStarted","Data":"601466600cef5e7769a88427f9f413418f305505ce12141a1bce2b4155289332"} Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.907457 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-9tspq" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.913896 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:16:42 crc kubenswrapper[4835]: I1002 11:16:42.968887 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.628740122 podStartE2EDuration="5.968835607s" podCreationTimestamp="2025-10-02 11:16:37 +0000 UTC" firstStartedPulling="2025-10-02 11:16:37.995669009 +0000 UTC m=+1274.555576590" lastFinishedPulling="2025-10-02 11:16:42.335764494 +0000 UTC m=+1278.895672075" observedRunningTime="2025-10-02 11:16:42.956696327 +0000 UTC m=+1279.516603918" watchObservedRunningTime="2025-10-02 11:16:42.968835607 +0000 UTC m=+1279.528743188" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.003558 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-9tspq"] Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.014320 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-9tspq"] Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.337562 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.447480 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-scripts\") pod \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.447678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-combined-ca-bundle\") pod \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.447758 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bptbf\" (UniqueName: \"kubernetes.io/projected/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-kube-api-access-bptbf\") pod \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.447867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-config-data\") pod \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\" (UID: \"3f1cca3d-b017-4cc3-875e-41a75f8ee14a\") " Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.453476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-kube-api-access-bptbf" (OuterVolumeSpecName: "kube-api-access-bptbf") pod "3f1cca3d-b017-4cc3-875e-41a75f8ee14a" (UID: "3f1cca3d-b017-4cc3-875e-41a75f8ee14a"). InnerVolumeSpecName "kube-api-access-bptbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.453898 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-scripts" (OuterVolumeSpecName: "scripts") pod "3f1cca3d-b017-4cc3-875e-41a75f8ee14a" (UID: "3f1cca3d-b017-4cc3-875e-41a75f8ee14a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.475858 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-config-data" (OuterVolumeSpecName: "config-data") pod "3f1cca3d-b017-4cc3-875e-41a75f8ee14a" (UID: "3f1cca3d-b017-4cc3-875e-41a75f8ee14a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.477837 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f1cca3d-b017-4cc3-875e-41a75f8ee14a" (UID: "3f1cca3d-b017-4cc3-875e-41a75f8ee14a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.550592 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.550635 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.550649 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.550661 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bptbf\" (UniqueName: \"kubernetes.io/projected/3f1cca3d-b017-4cc3-875e-41a75f8ee14a-kube-api-access-bptbf\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.928807 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.932319 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mzxn" event={"ID":"3f1cca3d-b017-4cc3-875e-41a75f8ee14a","Type":"ContainerDied","Data":"376a634b114981f4566695b731fe3a8379b69d7211c81c7a53e8344cda386baf"} Oct 02 11:16:43 crc kubenswrapper[4835]: I1002 11:16:43.932363 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="376a634b114981f4566695b731fe3a8379b69d7211c81c7a53e8344cda386baf" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.005312 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:16:44 crc kubenswrapper[4835]: E1002 11:16:44.005959 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28eb8749-6de4-4c76-928c-0f35ce4c378a" containerName="nova-manage" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.005973 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="28eb8749-6de4-4c76-928c-0f35ce4c378a" containerName="nova-manage" Oct 02 11:16:44 crc kubenswrapper[4835]: E1002 11:16:44.005992 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerName="dnsmasq-dns" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.005997 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerName="dnsmasq-dns" Oct 02 11:16:44 crc kubenswrapper[4835]: E1002 11:16:44.006009 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerName="init" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.006015 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerName="init" Oct 02 11:16:44 crc kubenswrapper[4835]: E1002 11:16:44.006027 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1cca3d-b017-4cc3-875e-41a75f8ee14a" containerName="nova-cell1-conductor-db-sync" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.006033 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1cca3d-b017-4cc3-875e-41a75f8ee14a" 
containerName="nova-cell1-conductor-db-sync" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.006212 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1cca3d-b017-4cc3-875e-41a75f8ee14a" containerName="nova-cell1-conductor-db-sync" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.006245 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="28eb8749-6de4-4c76-928c-0f35ce4c378a" containerName="nova-manage" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.006261 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" containerName="dnsmasq-dns" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.006909 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.011200 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.016489 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.060515 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac27bc0-d222-4588-88b6-d354949459a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.060663 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pld55\" (UniqueName: \"kubernetes.io/projected/bac27bc0-d222-4588-88b6-d354949459a2-kube-api-access-pld55\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.060685 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac27bc0-d222-4588-88b6-d354949459a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.162074 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac27bc0-d222-4588-88b6-d354949459a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.162187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pld55\" (UniqueName: \"kubernetes.io/projected/bac27bc0-d222-4588-88b6-d354949459a2-kube-api-access-pld55\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.162208 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac27bc0-d222-4588-88b6-d354949459a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: 
I1002 11:16:44.166339 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac27bc0-d222-4588-88b6-d354949459a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.167005 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac27bc0-d222-4588-88b6-d354949459a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.180308 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pld55\" (UniqueName: \"kubernetes.io/projected/bac27bc0-d222-4588-88b6-d354949459a2-kube-api-access-pld55\") pod \"nova-cell1-conductor-0\" (UID: \"bac27bc0-d222-4588-88b6-d354949459a2\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.269997 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0074a72-5ca4-452b-8c69-f8ad99e14a0f" path="/var/lib/kubelet/pods/c0074a72-5ca4-452b-8c69-f8ad99e14a0f/volumes" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.335809 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.811843 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:16:44 crc kubenswrapper[4835]: W1002 11:16:44.816066 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac27bc0_d222_4588_88b6_d354949459a2.slice/crio-afd19d6446af971a85a2e68acf674494ec34f22ee6583517ce096a75cda7a286 WatchSource:0}: Error finding container afd19d6446af971a85a2e68acf674494ec34f22ee6583517ce096a75cda7a286: Status 404 returned error can't find the container with id afd19d6446af971a85a2e68acf674494ec34f22ee6583517ce096a75cda7a286 Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.912126 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.944756 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bac27bc0-d222-4588-88b6-d354949459a2","Type":"ContainerStarted","Data":"afd19d6446af971a85a2e68acf674494ec34f22ee6583517ce096a75cda7a286"} Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.947166 4835 generic.go:334] "Generic (PLEG): container finished" podID="e02806da-6104-48e3-8d94-db8475e53b68" containerID="6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582" exitCode=0 Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.947322 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cd984794-hg967" event={"ID":"e02806da-6104-48e3-8d94-db8475e53b68","Type":"ContainerDied","Data":"6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582"} Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.947353 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76cd984794-hg967" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.947368 4835 scope.go:117] "RemoveContainer" containerID="89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.947356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cd984794-hg967" event={"ID":"e02806da-6104-48e3-8d94-db8475e53b68","Type":"ContainerDied","Data":"4328d3a8c34794fb2d6b8db0fe5f7a4643fcd1f3355f34fbb8d98c9f351c5d22"} Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.972831 4835 scope.go:117] "RemoveContainer" containerID="6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582" Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.991285 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-httpd-config\") pod \"e02806da-6104-48e3-8d94-db8475e53b68\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.991539 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-config\") pod \"e02806da-6104-48e3-8d94-db8475e53b68\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.991573 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-ovndb-tls-certs\") pod \"e02806da-6104-48e3-8d94-db8475e53b68\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.991717 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-combined-ca-bundle\") pod \"e02806da-6104-48e3-8d94-db8475e53b68\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " Oct 02 11:16:44 crc kubenswrapper[4835]: I1002 11:16:44.992001 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzd2n\" (UniqueName: \"kubernetes.io/projected/e02806da-6104-48e3-8d94-db8475e53b68-kube-api-access-bzd2n\") pod \"e02806da-6104-48e3-8d94-db8475e53b68\" (UID: \"e02806da-6104-48e3-8d94-db8475e53b68\") " Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.016723 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e02806da-6104-48e3-8d94-db8475e53b68" (UID: "e02806da-6104-48e3-8d94-db8475e53b68"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.024197 4835 scope.go:117] "RemoveContainer" containerID="89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a" Oct 02 11:16:45 crc kubenswrapper[4835]: E1002 11:16:45.025256 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a\": container with ID starting with 89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a not found: ID does not exist" containerID="89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.025290 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a"} err="failed to get container status \"89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a\": rpc error: code = NotFound desc = could not find container \"89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a\": container with ID starting with 89c285fd01bfed002ee50ea6f94d883971b8b6c1a1492c7437449f68e580c79a not found: ID does not exist" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.025316 4835 scope.go:117] "RemoveContainer" containerID="6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582" Oct 02 11:16:45 crc kubenswrapper[4835]: E1002 11:16:45.027288 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582\": container with ID starting with 6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582 not found: ID does not exist" containerID="6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.027336 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582"} err="failed to get container status \"6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582\": rpc error: code = NotFound desc = could not find container \"6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582\": container with ID starting with 6ad107af762b3968d4030f3aeb42928c93c43695f9d93a8787401ea931354582 not found: ID does not exist" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.031841 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02806da-6104-48e3-8d94-db8475e53b68-kube-api-access-bzd2n" (OuterVolumeSpecName: "kube-api-access-bzd2n") pod "e02806da-6104-48e3-8d94-db8475e53b68" (UID: "e02806da-6104-48e3-8d94-db8475e53b68"). InnerVolumeSpecName "kube-api-access-bzd2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.079207 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e02806da-6104-48e3-8d94-db8475e53b68" (UID: "e02806da-6104-48e3-8d94-db8475e53b68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.095067 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.095102 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzd2n\" (UniqueName: \"kubernetes.io/projected/e02806da-6104-48e3-8d94-db8475e53b68-kube-api-access-bzd2n\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.095117 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.096048 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-config" (OuterVolumeSpecName: "config") pod "e02806da-6104-48e3-8d94-db8475e53b68" (UID: "e02806da-6104-48e3-8d94-db8475e53b68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.102291 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e02806da-6104-48e3-8d94-db8475e53b68" (UID: "e02806da-6104-48e3-8d94-db8475e53b68"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.196722 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.196758 4835 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02806da-6104-48e3-8d94-db8475e53b68-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.294257 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76cd984794-hg967"] Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.303515 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-76cd984794-hg967"] Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.957548 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bac27bc0-d222-4588-88b6-d354949459a2","Type":"ContainerStarted","Data":"ca5ffe98b3b0dc8743b1274c907dd9585759b89292c110cb81aaf32cf9dcd887"} Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.957634 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:45 crc kubenswrapper[4835]: I1002 11:16:45.981639 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.981617628 podStartE2EDuration="2.981617628s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:45.972758392 +0000 UTC m=+1282.532665973" watchObservedRunningTime="2025-10-02 
11:16:45.981617628 +0000 UTC m=+1282.541525209" Oct 02 11:16:46 crc kubenswrapper[4835]: I1002 11:16:46.270769 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02806da-6104-48e3-8d94-db8475e53b68" path="/var/lib/kubelet/pods/e02806da-6104-48e3-8d94-db8475e53b68/volumes" Oct 02 11:16:49 crc kubenswrapper[4835]: I1002 11:16:49.350240 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:54 crc kubenswrapper[4835]: I1002 11:16:54.360246 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 11:17:00 crc kubenswrapper[4835]: I1002 11:17:00.893191 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:17:00 crc kubenswrapper[4835]: I1002 11:17:00.893628 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.110853 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.117820 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.127374 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.163280 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.205477 4835 generic.go:334] "Generic (PLEG): container finished" podID="be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" containerID="330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47" exitCode=137 Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.205568 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59","Type":"ContainerDied","Data":"330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.205604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59","Type":"ContainerDied","Data":"250c6de4432b288ee3cf61f77e2f08d09ec944a2301d7ab1e31bcaaba7ee5a1f"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.205627 4835 scope.go:117] "RemoveContainer" containerID="330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.205795 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.219006 4835 generic.go:334] "Generic (PLEG): container finished" podID="653132e3-7688-428a-8fcf-1b241941cb39" containerID="2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad" exitCode=137 Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.219081 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"653132e3-7688-428a-8fcf-1b241941cb39","Type":"ContainerDied","Data":"2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.219100 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.219107 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"653132e3-7688-428a-8fcf-1b241941cb39","Type":"ContainerDied","Data":"c55b75940b83777c1a067dbaa7fc4b0cc3d9b3999ad1c7dfa789ef9fd709ed87"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.221037 4835 generic.go:334] "Generic (PLEG): container finished" podID="d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" containerID="5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e" exitCode=137 Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.221103 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.221109 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39","Type":"ContainerDied","Data":"5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.221138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39","Type":"ContainerDied","Data":"e632f98883c831193e214c7919cad4b534e99b9fe2b9e6e901473905333d1966"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.223897 4835 generic.go:334] "Generic (PLEG): container finished" podID="a4aa3031-48f9-4215-92cc-d321275d6875" containerID="66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512" exitCode=137 Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.223944 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4aa3031-48f9-4215-92cc-d321275d6875","Type":"ContainerDied","Data":"66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.223971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4aa3031-48f9-4215-92cc-d321275d6875","Type":"ContainerDied","Data":"49b205c4023cae6916588be49d95e9dcde4d9cea16f120b9ba784129c23f075b"} Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.223981 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.237881 4835 scope.go:117] "RemoveContainer" containerID="330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.238252 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47\": container with ID starting with 330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47 not found: ID does not exist" containerID="330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.238292 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47"} err="failed to get container status \"330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47\": rpc error: code = NotFound desc = could not find container \"330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47\": container with ID starting with 330503d8545283f80f1345e990f71b713530f9094c47f903eccb7063ed640e47 not found: ID does not exist" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.238320 4835 scope.go:117] "RemoveContainer" containerID="2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.248954 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-config-data\") pod \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249016 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-combined-ca-bundle\") pod \"653132e3-7688-428a-8fcf-1b241941cb39\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249055 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-config-data\") pod \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249071 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-combined-ca-bundle\") pod \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249093 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-combined-ca-bundle\") pod \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\" (UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249113 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sfmq\" (UniqueName: \"kubernetes.io/projected/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-kube-api-access-9sfmq\") pod \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\" 
(UID: \"be66b9c7-49c0-4cdc-94d4-bd9c275c3d59\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249137 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-495nr\" (UniqueName: \"kubernetes.io/projected/653132e3-7688-428a-8fcf-1b241941cb39-kube-api-access-495nr\") pod \"653132e3-7688-428a-8fcf-1b241941cb39\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249174 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-combined-ca-bundle\") pod \"a4aa3031-48f9-4215-92cc-d321275d6875\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249199 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-config-data\") pod \"653132e3-7688-428a-8fcf-1b241941cb39\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249245 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqrqf\" (UniqueName: \"kubernetes.io/projected/a4aa3031-48f9-4215-92cc-d321275d6875-kube-api-access-kqrqf\") pod \"a4aa3031-48f9-4215-92cc-d321275d6875\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249264 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653132e3-7688-428a-8fcf-1b241941cb39-logs\") pod \"653132e3-7688-428a-8fcf-1b241941cb39\" (UID: \"653132e3-7688-428a-8fcf-1b241941cb39\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249302 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4aa3031-48f9-4215-92cc-d321275d6875-logs\") pod \"a4aa3031-48f9-4215-92cc-d321275d6875\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249332 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-config-data\") pod \"a4aa3031-48f9-4215-92cc-d321275d6875\" (UID: \"a4aa3031-48f9-4215-92cc-d321275d6875\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.249388 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljhlr\" (UniqueName: \"kubernetes.io/projected/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-kube-api-access-ljhlr\") pod \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\" (UID: \"d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39\") " Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.250836 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4aa3031-48f9-4215-92cc-d321275d6875-logs" (OuterVolumeSpecName: "logs") pod "a4aa3031-48f9-4215-92cc-d321275d6875" (UID: "a4aa3031-48f9-4215-92cc-d321275d6875"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.251150 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/653132e3-7688-428a-8fcf-1b241941cb39-logs" (OuterVolumeSpecName: "logs") pod "653132e3-7688-428a-8fcf-1b241941cb39" (UID: "653132e3-7688-428a-8fcf-1b241941cb39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.256648 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653132e3-7688-428a-8fcf-1b241941cb39-kube-api-access-495nr" (OuterVolumeSpecName: "kube-api-access-495nr") pod "653132e3-7688-428a-8fcf-1b241941cb39" (UID: "653132e3-7688-428a-8fcf-1b241941cb39"). InnerVolumeSpecName "kube-api-access-495nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.258768 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-kube-api-access-9sfmq" (OuterVolumeSpecName: "kube-api-access-9sfmq") pod "be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" (UID: "be66b9c7-49c0-4cdc-94d4-bd9c275c3d59"). InnerVolumeSpecName "kube-api-access-9sfmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.259912 4835 scope.go:117] "RemoveContainer" containerID="0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.261151 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4aa3031-48f9-4215-92cc-d321275d6875-kube-api-access-kqrqf" (OuterVolumeSpecName: "kube-api-access-kqrqf") pod "a4aa3031-48f9-4215-92cc-d321275d6875" (UID: "a4aa3031-48f9-4215-92cc-d321275d6875"). InnerVolumeSpecName "kube-api-access-kqrqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.261897 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-kube-api-access-ljhlr" (OuterVolumeSpecName: "kube-api-access-ljhlr") pod "d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" (UID: "d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39"). InnerVolumeSpecName "kube-api-access-ljhlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.280453 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-config-data" (OuterVolumeSpecName: "config-data") pod "653132e3-7688-428a-8fcf-1b241941cb39" (UID: "653132e3-7688-428a-8fcf-1b241941cb39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.280503 4835 scope.go:117] "RemoveContainer" containerID="2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.280649 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" (UID: "be66b9c7-49c0-4cdc-94d4-bd9c275c3d59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.281054 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad\": container with ID starting with 2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad not found: ID does not exist" containerID="2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.281089 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad"} err="failed to get container status \"2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad\": rpc error: code = NotFound desc = could not find container \"2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad\": container with ID starting with 2588f1dd2fcd464ddc356cb6efbc81835756bb5f10479d4babfe6eed93c841ad not found: ID does not exist" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.281112 4835 scope.go:117] "RemoveContainer" containerID="0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.281560 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7\": container with ID starting with 0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7 not found: ID does not exist" containerID="0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.281589 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7"} err="failed to get container status \"0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7\": rpc error: code = NotFound desc = could not find container \"0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7\": container with ID starting with 0b6c60fc6bfa109fcffe5cb8245f39500a9ec25f7610de2c63d85bb1574dafd7 not found: ID does not exist" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.281607 4835 scope.go:117] "RemoveContainer" containerID="5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.286424 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" (UID: "d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.286728 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4aa3031-48f9-4215-92cc-d321275d6875" (UID: "a4aa3031-48f9-4215-92cc-d321275d6875"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.287171 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-config-data" (OuterVolumeSpecName: "config-data") pod "d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" (UID: "d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.288606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "653132e3-7688-428a-8fcf-1b241941cb39" (UID: "653132e3-7688-428a-8fcf-1b241941cb39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.293499 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-config-data" (OuterVolumeSpecName: "config-data") pod "be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" (UID: "be66b9c7-49c0-4cdc-94d4-bd9c275c3d59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.296433 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-config-data" (OuterVolumeSpecName: "config-data") pod "a4aa3031-48f9-4215-92cc-d321275d6875" (UID: "a4aa3031-48f9-4215-92cc-d321275d6875"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.307852 4835 scope.go:117] "RemoveContainer" containerID="5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.308323 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e\": container with ID starting with 5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e not found: ID does not exist" containerID="5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.308390 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e"} err="failed to get container status \"5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e\": rpc error: code = NotFound desc = could not find container \"5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e\": container with ID starting with 5bcc66dac0b7fce1ce2bd148214d71c4f1ed7ccdd8ed0aa081c034762cdaeb5e not found: ID does not exist" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.308439 4835 scope.go:117] "RemoveContainer" containerID="66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.324559 4835 scope.go:117] "RemoveContainer" containerID="c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.339926 4835 scope.go:117] "RemoveContainer" 
containerID="66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.340457 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512\": container with ID starting with 66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512 not found: ID does not exist" containerID="66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.340494 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512"} err="failed to get container status \"66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512\": rpc error: code = NotFound desc = could not find container \"66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512\": container with ID starting with 66c262070c4727ed9d4547c7e5d2f8dc0b0c55ee1baeee0795cce7a4431a4512 not found: ID does not exist" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.340517 4835 scope.go:117] "RemoveContainer" containerID="c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.340727 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb\": container with ID starting with c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb not found: ID does not exist" containerID="c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.340776 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb"} err="failed to get container status \"c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb\": rpc error: code = NotFound desc = could not find container \"c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb\": container with ID starting with c7b83ffec4ee3829bcb57beb9b1ccb7ba819c462a820ee87ceafeea4352477fb not found: ID does not exist" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351341 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqrqf\" (UniqueName: \"kubernetes.io/projected/a4aa3031-48f9-4215-92cc-d321275d6875-kube-api-access-kqrqf\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351379 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653132e3-7688-428a-8fcf-1b241941cb39-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351393 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4aa3031-48f9-4215-92cc-d321275d6875-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351404 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351419 4835 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ljhlr\" (UniqueName: \"kubernetes.io/projected/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-kube-api-access-ljhlr\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351433 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351443 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351453 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351464 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351474 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351485 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sfmq\" (UniqueName: \"kubernetes.io/projected/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59-kube-api-access-9sfmq\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351496 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-495nr\" (UniqueName: \"kubernetes.io/projected/653132e3-7688-428a-8fcf-1b241941cb39-kube-api-access-495nr\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351506 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aa3031-48f9-4215-92cc-d321275d6875-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.351514 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653132e3-7688-428a-8fcf-1b241941cb39-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.464547 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.574444 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.579986 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.592017 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.605319 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617266 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: 
E1002 11:17:07.617728 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617746 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.617768 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-api" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617775 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-api" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.617786 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-api" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617793 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-api" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.617803 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-log" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617809 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-log" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.617816 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-metadata" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617822 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-metadata" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.617833 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-httpd" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617838 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-httpd" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.617844 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-log" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617850 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-log" Oct 02 11:17:07 crc kubenswrapper[4835]: E1002 11:17:07.617866 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" containerName="nova-scheduler-scheduler" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.617873 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" containerName="nova-scheduler-scheduler" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618020 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-api" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618033 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618050 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" containerName="nova-api-log" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618058 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" containerName="nova-scheduler-scheduler" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618076 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-metadata" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618088 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-httpd" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618095 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="653132e3-7688-428a-8fcf-1b241941cb39" containerName="nova-metadata-log" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.618104 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02806da-6104-48e3-8d94-db8475e53b68" containerName="neutron-api" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.619208 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.621397 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.629970 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.641634 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.649517 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.661386 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.664317 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.669674 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.669808 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.673521 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.709478 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.716107 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.717728 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.726492 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.729053 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.734396 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.754414 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.757400 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.761115 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.761398 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.761409 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.770245 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.770836 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.770911 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-config-data\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.771054 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz8lr\" (UniqueName: \"kubernetes.io/projected/3640b038-8550-4290-b01b-43858263b442-kube-api-access-nz8lr\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.771148 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd65w\" (UniqueName: \"kubernetes.io/projected/12ca9d70-c91b-4f22-bc37-f768c874eb4c-kube-api-access-zd65w\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.771193 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: 
I1002 11:17:07.771253 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.771383 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ca9d70-c91b-4f22-bc37-f768c874eb4c-logs\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.771420 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-config-data\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.873741 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.873815 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.873870 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.873895 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4cw\" (UniqueName: \"kubernetes.io/projected/cdee82f4-8419-4cf1-8910-9c5516070f11-kube-api-access-vk4cw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.873931 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.873956 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ca9d70-c91b-4f22-bc37-f768c874eb4c-logs\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.873973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-config-data\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874009 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-config-data\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874050 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw997\" (UniqueName: \"kubernetes.io/projected/3ad9af28-1781-46aa-ba72-995525b67008-kube-api-access-mw997\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874067 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874082 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad9af28-1781-46aa-ba72-995525b67008-logs\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874098 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874114 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-config-data\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874175 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz8lr\" (UniqueName: \"kubernetes.io/projected/3640b038-8550-4290-b01b-43858263b442-kube-api-access-nz8lr\") pod \"nova-scheduler-0\" (UID: 
\"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.874248 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd65w\" (UniqueName: \"kubernetes.io/projected/12ca9d70-c91b-4f22-bc37-f768c874eb4c-kube-api-access-zd65w\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.875082 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ca9d70-c91b-4f22-bc37-f768c874eb4c-logs\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.879493 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.885823 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.886361 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-config-data\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.892650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-config-data\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.892710 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.893102 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz8lr\" (UniqueName: \"kubernetes.io/projected/3640b038-8550-4290-b01b-43858263b442-kube-api-access-nz8lr\") pod \"nova-scheduler-0\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.894412 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd65w\" (UniqueName: \"kubernetes.io/projected/12ca9d70-c91b-4f22-bc37-f768c874eb4c-kube-api-access-zd65w\") pod \"nova-metadata-0\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.942821 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.976043 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-config-data\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.976501 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw997\" (UniqueName: \"kubernetes.io/projected/3ad9af28-1781-46aa-ba72-995525b67008-kube-api-access-mw997\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.976716 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad9af28-1781-46aa-ba72-995525b67008-logs\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.976857 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.977035 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.977383 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.977599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.978103 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4cw\" (UniqueName: \"kubernetes.io/projected/cdee82f4-8419-4cf1-8910-9c5516070f11-kube-api-access-vk4cw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.978274 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.977472 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3ad9af28-1781-46aa-ba72-995525b67008-logs\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.982402 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.984350 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-config-data\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.984733 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.985004 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.986724 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.986900 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee82f4-8419-4cf1-8910-9c5516070f11-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.988924 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.997183 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4cw\" (UniqueName: \"kubernetes.io/projected/cdee82f4-8419-4cf1-8910-9c5516070f11-kube-api-access-vk4cw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cdee82f4-8419-4cf1-8910-9c5516070f11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:07 crc kubenswrapper[4835]: I1002 11:17:07.997662 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw997\" (UniqueName: \"kubernetes.io/projected/3ad9af28-1781-46aa-ba72-995525b67008-kube-api-access-mw997\") pod \"nova-api-0\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " pod="openstack/nova-api-0" Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.107414 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.113960 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.271435 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653132e3-7688-428a-8fcf-1b241941cb39" path="/var/lib/kubelet/pods/653132e3-7688-428a-8fcf-1b241941cb39/volumes" Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.272585 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4aa3031-48f9-4215-92cc-d321275d6875" path="/var/lib/kubelet/pods/a4aa3031-48f9-4215-92cc-d321275d6875/volumes" Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.273210 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be66b9c7-49c0-4cdc-94d4-bd9c275c3d59" path="/var/lib/kubelet/pods/be66b9c7-49c0-4cdc-94d4-bd9c275c3d59/volumes" Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.274168 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39" path="/var/lib/kubelet/pods/d9d469c8-4b35-4cd0-9aa1-5a2b45ba7c39/volumes" Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.414554 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:08 crc kubenswrapper[4835]: W1002 11:17:08.419073 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3640b038_8550_4290_b01b_43858263b442.slice/crio-5fee7a9022c3436c58e3da100b292f276648ce0379683e38872dea9abacbf1fc WatchSource:0}: Error finding container 5fee7a9022c3436c58e3da100b292f276648ce0379683e38872dea9abacbf1fc: Status 404 returned error can't find the container with id 5fee7a9022c3436c58e3da100b292f276648ce0379683e38872dea9abacbf1fc Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.484077 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.613343 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:17:08 crc kubenswrapper[4835]: W1002 11:17:08.618934 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad9af28_1781_46aa_ba72_995525b67008.slice/crio-abc7769718f0f16aa6c806846fda5d08ad45bfb8b9b44ecc4e49d9705b3ad7d6 WatchSource:0}: Error finding container abc7769718f0f16aa6c806846fda5d08ad45bfb8b9b44ecc4e49d9705b3ad7d6: Status 404 returned error can't find the container with id abc7769718f0f16aa6c806846fda5d08ad45bfb8b9b44ecc4e49d9705b3ad7d6 Oct 02 11:17:08 crc kubenswrapper[4835]: W1002 11:17:08.621083 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdee82f4_8419_4cf1_8910_9c5516070f11.slice/crio-d6039188ed3031e5328bf2eb846b22ee23901f6bb80ebfc89cb64139544987c8 WatchSource:0}: Error finding container d6039188ed3031e5328bf2eb846b22ee23901f6bb80ebfc89cb64139544987c8: Status 404 returned error can't find the container with id d6039188ed3031e5328bf2eb846b22ee23901f6bb80ebfc89cb64139544987c8 Oct 02 11:17:08 crc kubenswrapper[4835]: I1002 11:17:08.621945 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.264564 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"cdee82f4-8419-4cf1-8910-9c5516070f11","Type":"ContainerStarted","Data":"074ba3f523bc839421210fa243234ffbb1cc97e07b4269c7c8b22d981fae580c"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.264917 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cdee82f4-8419-4cf1-8910-9c5516070f11","Type":"ContainerStarted","Data":"d6039188ed3031e5328bf2eb846b22ee23901f6bb80ebfc89cb64139544987c8"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.267605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ad9af28-1781-46aa-ba72-995525b67008","Type":"ContainerStarted","Data":"217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.267666 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ad9af28-1781-46aa-ba72-995525b67008","Type":"ContainerStarted","Data":"2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.267689 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ad9af28-1781-46aa-ba72-995525b67008","Type":"ContainerStarted","Data":"abc7769718f0f16aa6c806846fda5d08ad45bfb8b9b44ecc4e49d9705b3ad7d6"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.272539 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ca9d70-c91b-4f22-bc37-f768c874eb4c","Type":"ContainerStarted","Data":"46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.272604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ca9d70-c91b-4f22-bc37-f768c874eb4c","Type":"ContainerStarted","Data":"b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.272615 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ca9d70-c91b-4f22-bc37-f768c874eb4c","Type":"ContainerStarted","Data":"882e282e4e589496f39fef6e6e7a5e9e41535a43e0d558ee51e1dfd788903ed6"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.278334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3640b038-8550-4290-b01b-43858263b442","Type":"ContainerStarted","Data":"6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.278402 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3640b038-8550-4290-b01b-43858263b442","Type":"ContainerStarted","Data":"5fee7a9022c3436c58e3da100b292f276648ce0379683e38872dea9abacbf1fc"} Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.291023 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.290994688 podStartE2EDuration="2.290994688s" podCreationTimestamp="2025-10-02 11:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:09.284517691 +0000 UTC m=+1305.844425272" watchObservedRunningTime="2025-10-02 11:17:09.290994688 +0000 UTC m=+1305.850902269" Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.309437 4835 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.30941152 podStartE2EDuration="2.30941152s" podCreationTimestamp="2025-10-02 11:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:09.302993354 +0000 UTC m=+1305.862900945" watchObservedRunningTime="2025-10-02 11:17:09.30941152 +0000 UTC m=+1305.869319101" Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.361320 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.361297778 podStartE2EDuration="2.361297778s" podCreationTimestamp="2025-10-02 11:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:09.325593727 +0000 UTC m=+1305.885501308" watchObservedRunningTime="2025-10-02 11:17:09.361297778 +0000 UTC m=+1305.921205359" Oct 02 11:17:09 crc kubenswrapper[4835]: I1002 11:17:09.370339 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.370318549 podStartE2EDuration="2.370318549s" podCreationTimestamp="2025-10-02 11:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:09.350485396 +0000 UTC m=+1305.910392977" watchObservedRunningTime="2025-10-02 11:17:09.370318549 +0000 UTC m=+1305.930226130" Oct 02 11:17:12 crc kubenswrapper[4835]: I1002 11:17:12.943308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:17:12 crc kubenswrapper[4835]: I1002 11:17:12.989768 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:17:12 crc kubenswrapper[4835]: I1002 11:17:12.989839 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:17:13 crc kubenswrapper[4835]: I1002 11:17:13.115006 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:17 crc kubenswrapper[4835]: I1002 11:17:17.943918 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:17:17 crc kubenswrapper[4835]: I1002 11:17:17.971330 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:17:17 crc kubenswrapper[4835]: I1002 11:17:17.990140 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:17:17 crc kubenswrapper[4835]: I1002 11:17:17.990185 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.120412 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.120504 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.121450 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.145381 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.393666 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.404109 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.598669 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5lchq"] Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.600128 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.602683 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.603119 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.609579 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5lchq"] Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.698066 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.698157 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-config-data\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.698194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbwls\" (UniqueName: \"kubernetes.io/projected/b0793397-2746-4cf7-9285-15ae5d86ffd2-kube-api-access-vbwls\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.698315 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-scripts\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.800896 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-scripts\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.801734 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5lchq\" 
(UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.801770 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-config-data\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.801805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbwls\" (UniqueName: \"kubernetes.io/projected/b0793397-2746-4cf7-9285-15ae5d86ffd2-kube-api-access-vbwls\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.809552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.821512 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-config-data\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.821979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-scripts\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.823887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbwls\" (UniqueName: \"kubernetes.io/projected/b0793397-2746-4cf7-9285-15ae5d86ffd2-kube-api-access-vbwls\") pod \"nova-cell1-cell-mapping-5lchq\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:18 crc kubenswrapper[4835]: I1002 11:17:18.939488 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:19 crc kubenswrapper[4835]: I1002 11:17:19.007394 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:19 crc kubenswrapper[4835]: I1002 11:17:19.007796 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:19 crc kubenswrapper[4835]: I1002 11:17:19.205639 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:19 crc kubenswrapper[4835]: I1002 11:17:19.205698 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:19 crc kubenswrapper[4835]: W1002 11:17:19.410726 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0793397_2746_4cf7_9285_15ae5d86ffd2.slice/crio-2da84efb72e33ca3ad44d819d4494347bc133c323838579fc76265976bc052aa WatchSource:0}: Error finding container 2da84efb72e33ca3ad44d819d4494347bc133c323838579fc76265976bc052aa: Status 404 returned error can't find the container with id 2da84efb72e33ca3ad44d819d4494347bc133c323838579fc76265976bc052aa Oct 02 11:17:19 crc kubenswrapper[4835]: I1002 11:17:19.418185 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5lchq"] Oct 02 11:17:20 crc kubenswrapper[4835]: I1002 11:17:20.398806 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5lchq" event={"ID":"b0793397-2746-4cf7-9285-15ae5d86ffd2","Type":"ContainerStarted","Data":"84ea7e736a7ae556aab418997888bcb0aca4aa93feeb24a6a1da9ae4c078a246"} Oct 02 11:17:20 crc kubenswrapper[4835]: I1002 11:17:20.399145 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5lchq" event={"ID":"b0793397-2746-4cf7-9285-15ae5d86ffd2","Type":"ContainerStarted","Data":"2da84efb72e33ca3ad44d819d4494347bc133c323838579fc76265976bc052aa"} Oct 02 11:17:20 crc kubenswrapper[4835]: I1002 11:17:20.420866 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5lchq" podStartSLOduration=2.420844383 podStartE2EDuration="2.420844383s" podCreationTimestamp="2025-10-02 11:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:20.417208218 +0000 UTC m=+1316.977115799" watchObservedRunningTime="2025-10-02 11:17:20.420844383 +0000 UTC m=+1316.980751964" Oct 02 11:17:25 crc kubenswrapper[4835]: I1002 11:17:25.485351 4835 generic.go:334] "Generic (PLEG): 
container finished" podID="b0793397-2746-4cf7-9285-15ae5d86ffd2" containerID="84ea7e736a7ae556aab418997888bcb0aca4aa93feeb24a6a1da9ae4c078a246" exitCode=0 Oct 02 11:17:25 crc kubenswrapper[4835]: I1002 11:17:25.485411 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5lchq" event={"ID":"b0793397-2746-4cf7-9285-15ae5d86ffd2","Type":"ContainerDied","Data":"84ea7e736a7ae556aab418997888bcb0aca4aa93feeb24a6a1da9ae4c078a246"} Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.841791 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.971170 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-combined-ca-bundle\") pod \"b0793397-2746-4cf7-9285-15ae5d86ffd2\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.971430 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbwls\" (UniqueName: \"kubernetes.io/projected/b0793397-2746-4cf7-9285-15ae5d86ffd2-kube-api-access-vbwls\") pod \"b0793397-2746-4cf7-9285-15ae5d86ffd2\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.971490 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-scripts\") pod \"b0793397-2746-4cf7-9285-15ae5d86ffd2\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.971524 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-config-data\") pod \"b0793397-2746-4cf7-9285-15ae5d86ffd2\" (UID: \"b0793397-2746-4cf7-9285-15ae5d86ffd2\") " Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.977304 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-scripts" (OuterVolumeSpecName: "scripts") pod "b0793397-2746-4cf7-9285-15ae5d86ffd2" (UID: "b0793397-2746-4cf7-9285-15ae5d86ffd2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.978400 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0793397-2746-4cf7-9285-15ae5d86ffd2-kube-api-access-vbwls" (OuterVolumeSpecName: "kube-api-access-vbwls") pod "b0793397-2746-4cf7-9285-15ae5d86ffd2" (UID: "b0793397-2746-4cf7-9285-15ae5d86ffd2"). InnerVolumeSpecName "kube-api-access-vbwls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.998333 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0793397-2746-4cf7-9285-15ae5d86ffd2" (UID: "b0793397-2746-4cf7-9285-15ae5d86ffd2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:26 crc kubenswrapper[4835]: I1002 11:17:26.998879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-config-data" (OuterVolumeSpecName: "config-data") pod "b0793397-2746-4cf7-9285-15ae5d86ffd2" (UID: "b0793397-2746-4cf7-9285-15ae5d86ffd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.074654 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.074695 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbwls\" (UniqueName: \"kubernetes.io/projected/b0793397-2746-4cf7-9285-15ae5d86ffd2-kube-api-access-vbwls\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.074710 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.074721 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793397-2746-4cf7-9285-15ae5d86ffd2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.506024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5lchq" event={"ID":"b0793397-2746-4cf7-9285-15ae5d86ffd2","Type":"ContainerDied","Data":"2da84efb72e33ca3ad44d819d4494347bc133c323838579fc76265976bc052aa"} Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.506372 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2da84efb72e33ca3ad44d819d4494347bc133c323838579fc76265976bc052aa" Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.506120 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5lchq" Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.698016 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.698381 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-log" containerID="cri-o://2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8" gracePeriod=30 Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.698549 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-api" containerID="cri-o://217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265" gracePeriod=30 Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.711551 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.711962 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3640b038-8550-4290-b01b-43858263b442" containerName="nova-scheduler-scheduler" containerID="cri-o://6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" gracePeriod=30 Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.723311 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.723640 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-log" containerID="cri-o://b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba" gracePeriod=30 Oct 02 11:17:27 crc kubenswrapper[4835]: I1002 11:17:27.723727 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-metadata" containerID="cri-o://46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e" gracePeriod=30 Oct 02 11:17:27 crc kubenswrapper[4835]: E1002 11:17:27.945467 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:17:27 crc kubenswrapper[4835]: E1002 11:17:27.947594 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:17:27 crc kubenswrapper[4835]: E1002 11:17:27.949119 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:17:27 crc kubenswrapper[4835]: E1002 11:17:27.949212 4835 prober.go:104] "Probe errored" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3640b038-8550-4290-b01b-43858263b442" containerName="nova-scheduler-scheduler" Oct 02 11:17:28 crc kubenswrapper[4835]: I1002 11:17:28.515740 4835 generic.go:334] "Generic (PLEG): container finished" podID="3ad9af28-1781-46aa-ba72-995525b67008" containerID="2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8" exitCode=143 Oct 02 11:17:28 crc kubenswrapper[4835]: I1002 11:17:28.515816 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ad9af28-1781-46aa-ba72-995525b67008","Type":"ContainerDied","Data":"2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8"} Oct 02 11:17:28 crc kubenswrapper[4835]: I1002 11:17:28.517487 4835 generic.go:334] "Generic (PLEG): container finished" podID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerID="b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba" exitCode=143 Oct 02 11:17:28 crc kubenswrapper[4835]: I1002 11:17:28.517514 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ca9d70-c91b-4f22-bc37-f768c874eb4c","Type":"ContainerDied","Data":"b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba"} Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.377277 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.457789 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz8lr\" (UniqueName: \"kubernetes.io/projected/3640b038-8550-4290-b01b-43858263b442-kube-api-access-nz8lr\") pod \"3640b038-8550-4290-b01b-43858263b442\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.458737 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-config-data\") pod \"3640b038-8550-4290-b01b-43858263b442\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.460829 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-combined-ca-bundle\") pod \"3640b038-8550-4290-b01b-43858263b442\" (UID: \"3640b038-8550-4290-b01b-43858263b442\") " Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.464194 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3640b038-8550-4290-b01b-43858263b442-kube-api-access-nz8lr" (OuterVolumeSpecName: "kube-api-access-nz8lr") pod "3640b038-8550-4290-b01b-43858263b442" (UID: "3640b038-8550-4290-b01b-43858263b442"). InnerVolumeSpecName "kube-api-access-nz8lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.489306 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3640b038-8550-4290-b01b-43858263b442" (UID: "3640b038-8550-4290-b01b-43858263b442"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.495470 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-config-data" (OuterVolumeSpecName: "config-data") pod "3640b038-8550-4290-b01b-43858263b442" (UID: "3640b038-8550-4290-b01b-43858263b442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.539590 4835 generic.go:334] "Generic (PLEG): container finished" podID="3640b038-8550-4290-b01b-43858263b442" containerID="6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" exitCode=0 Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.539642 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3640b038-8550-4290-b01b-43858263b442","Type":"ContainerDied","Data":"6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6"} Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.539670 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3640b038-8550-4290-b01b-43858263b442","Type":"ContainerDied","Data":"5fee7a9022c3436c58e3da100b292f276648ce0379683e38872dea9abacbf1fc"} Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.539688 4835 scope.go:117] "RemoveContainer" containerID="6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.539857 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.563635 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.563666 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3640b038-8550-4290-b01b-43858263b442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.563678 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz8lr\" (UniqueName: \"kubernetes.io/projected/3640b038-8550-4290-b01b-43858263b442-kube-api-access-nz8lr\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.579581 4835 scope.go:117] "RemoveContainer" containerID="6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" Oct 02 11:17:30 crc kubenswrapper[4835]: E1002 11:17:30.580048 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6\": container with ID starting with 6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6 not found: ID does not exist" containerID="6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.580108 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6"} err="failed to get container status \"6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6\": rpc error: code = NotFound desc = could not find container 
\"6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6\": container with ID starting with 6c62fc37902560c6a44ba4264bbd6df987e3ac6cbd38fa275b265842bfe1b2c6 not found: ID does not exist" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.586007 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.619924 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.624172 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:30 crc kubenswrapper[4835]: E1002 11:17:30.625213 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0793397-2746-4cf7-9285-15ae5d86ffd2" containerName="nova-manage" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.625253 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0793397-2746-4cf7-9285-15ae5d86ffd2" containerName="nova-manage" Oct 02 11:17:30 crc kubenswrapper[4835]: E1002 11:17:30.625293 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3640b038-8550-4290-b01b-43858263b442" containerName="nova-scheduler-scheduler" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.625300 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3640b038-8550-4290-b01b-43858263b442" containerName="nova-scheduler-scheduler" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.625499 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0793397-2746-4cf7-9285-15ae5d86ffd2" containerName="nova-manage" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.625513 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3640b038-8550-4290-b01b-43858263b442" containerName="nova-scheduler-scheduler" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.626311 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.633836 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.639084 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.766658 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcpxj\" (UniqueName: \"kubernetes.io/projected/4d7698ba-7a87-4210-9d11-8bb99f997178-kube-api-access-xcpxj\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.766732 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7698ba-7a87-4210-9d11-8bb99f997178-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.766813 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7698ba-7a87-4210-9d11-8bb99f997178-config-data\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.868839 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcpxj\" (UniqueName: \"kubernetes.io/projected/4d7698ba-7a87-4210-9d11-8bb99f997178-kube-api-access-xcpxj\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.868896 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7698ba-7a87-4210-9d11-8bb99f997178-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.868947 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7698ba-7a87-4210-9d11-8bb99f997178-config-data\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.874569 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7698ba-7a87-4210-9d11-8bb99f997178-config-data\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.876860 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7698ba-7a87-4210-9d11-8bb99f997178-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.889793 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcpxj\" (UniqueName: 
\"kubernetes.io/projected/4d7698ba-7a87-4210-9d11-8bb99f997178-kube-api-access-xcpxj\") pod \"nova-scheduler-0\" (UID: \"4d7698ba-7a87-4210-9d11-8bb99f997178\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:30 crc kubenswrapper[4835]: I1002 11:17:30.955690 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.360306 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.366654 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485010 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-combined-ca-bundle\") pod \"3ad9af28-1781-46aa-ba72-995525b67008\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485125 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd65w\" (UniqueName: \"kubernetes.io/projected/12ca9d70-c91b-4f22-bc37-f768c874eb4c-kube-api-access-zd65w\") pod \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485181 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-nova-metadata-tls-certs\") pod \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485279 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw997\" (UniqueName: \"kubernetes.io/projected/3ad9af28-1781-46aa-ba72-995525b67008-kube-api-access-mw997\") pod \"3ad9af28-1781-46aa-ba72-995525b67008\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485388 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad9af28-1781-46aa-ba72-995525b67008-logs\") pod \"3ad9af28-1781-46aa-ba72-995525b67008\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485445 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ca9d70-c91b-4f22-bc37-f768c874eb4c-logs\") pod \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485494 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-config-data\") pod \"3ad9af28-1781-46aa-ba72-995525b67008\" (UID: \"3ad9af28-1781-46aa-ba72-995525b67008\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-combined-ca-bundle\") pod \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " 
Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.485609 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-config-data\") pod \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\" (UID: \"12ca9d70-c91b-4f22-bc37-f768c874eb4c\") " Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.488503 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad9af28-1781-46aa-ba72-995525b67008-logs" (OuterVolumeSpecName: "logs") pod "3ad9af28-1781-46aa-ba72-995525b67008" (UID: "3ad9af28-1781-46aa-ba72-995525b67008"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.489024 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ca9d70-c91b-4f22-bc37-f768c874eb4c-logs" (OuterVolumeSpecName: "logs") pod "12ca9d70-c91b-4f22-bc37-f768c874eb4c" (UID: "12ca9d70-c91b-4f22-bc37-f768c874eb4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.493575 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad9af28-1781-46aa-ba72-995525b67008-kube-api-access-mw997" (OuterVolumeSpecName: "kube-api-access-mw997") pod "3ad9af28-1781-46aa-ba72-995525b67008" (UID: "3ad9af28-1781-46aa-ba72-995525b67008"). InnerVolumeSpecName "kube-api-access-mw997". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.499311 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ca9d70-c91b-4f22-bc37-f768c874eb4c-kube-api-access-zd65w" (OuterVolumeSpecName: "kube-api-access-zd65w") pod "12ca9d70-c91b-4f22-bc37-f768c874eb4c" (UID: "12ca9d70-c91b-4f22-bc37-f768c874eb4c"). InnerVolumeSpecName "kube-api-access-zd65w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.501957 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.522584 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ad9af28-1781-46aa-ba72-995525b67008" (UID: "3ad9af28-1781-46aa-ba72-995525b67008"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.524061 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-config-data" (OuterVolumeSpecName: "config-data") pod "12ca9d70-c91b-4f22-bc37-f768c874eb4c" (UID: "12ca9d70-c91b-4f22-bc37-f768c874eb4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: W1002 11:17:31.529118 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d7698ba_7a87_4210_9d11_8bb99f997178.slice/crio-28375f4a0b3a8df11afd096d2e062b3b29ae9461443e331513f9b25e42383795 WatchSource:0}: Error finding container 28375f4a0b3a8df11afd096d2e062b3b29ae9461443e331513f9b25e42383795: Status 404 returned error can't find the container with id 28375f4a0b3a8df11afd096d2e062b3b29ae9461443e331513f9b25e42383795 Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.530826 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-config-data" (OuterVolumeSpecName: "config-data") pod "3ad9af28-1781-46aa-ba72-995525b67008" (UID: "3ad9af28-1781-46aa-ba72-995525b67008"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.534151 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ca9d70-c91b-4f22-bc37-f768c874eb4c" (UID: "12ca9d70-c91b-4f22-bc37-f768c874eb4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.551074 4835 generic.go:334] "Generic (PLEG): container finished" podID="3ad9af28-1781-46aa-ba72-995525b67008" containerID="217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265" exitCode=0 Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.551141 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.551141 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ad9af28-1781-46aa-ba72-995525b67008","Type":"ContainerDied","Data":"217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265"} Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.551259 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3ad9af28-1781-46aa-ba72-995525b67008","Type":"ContainerDied","Data":"abc7769718f0f16aa6c806846fda5d08ad45bfb8b9b44ecc4e49d9705b3ad7d6"} Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.551287 4835 scope.go:117] "RemoveContainer" containerID="217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.556119 4835 generic.go:334] "Generic (PLEG): container finished" podID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerID="46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e" exitCode=0 Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.556565 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.556430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ca9d70-c91b-4f22-bc37-f768c874eb4c","Type":"ContainerDied","Data":"46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e"} Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.556796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ca9d70-c91b-4f22-bc37-f768c874eb4c","Type":"ContainerDied","Data":"882e282e4e589496f39fef6e6e7a5e9e41535a43e0d558ee51e1dfd788903ed6"} Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.562749 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d7698ba-7a87-4210-9d11-8bb99f997178","Type":"ContainerStarted","Data":"28375f4a0b3a8df11afd096d2e062b3b29ae9461443e331513f9b25e42383795"} Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.564955 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "12ca9d70-c91b-4f22-bc37-f768c874eb4c" (UID: "12ca9d70-c91b-4f22-bc37-f768c874eb4c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.578086 4835 scope.go:117] "RemoveContainer" containerID="2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590545 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad9af28-1781-46aa-ba72-995525b67008-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590578 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ca9d70-c91b-4f22-bc37-f768c874eb4c-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590590 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590606 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590619 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590629 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad9af28-1781-46aa-ba72-995525b67008-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590640 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd65w\" (UniqueName: \"kubernetes.io/projected/12ca9d70-c91b-4f22-bc37-f768c874eb4c-kube-api-access-zd65w\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590652 4835 reconciler_common.go:293] "Volume detached 
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ca9d70-c91b-4f22-bc37-f768c874eb4c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.590663 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw997\" (UniqueName: \"kubernetes.io/projected/3ad9af28-1781-46aa-ba72-995525b67008-kube-api-access-mw997\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.599771 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.613643 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.627371 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.628965 4835 scope.go:117] "RemoveContainer" containerID="217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265" Oct 02 11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.629523 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265\": container with ID starting with 217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265 not found: ID does not exist" containerID="217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.629578 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265"} err="failed to get container status \"217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265\": rpc error: code = NotFound desc = could not find container \"217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265\": container with ID starting with 217e921592c5a2fade8b687fc4a3de5e5fba85e8460bccfe04323725d9542265 not found: ID does not exist" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.629615 4835 scope.go:117] "RemoveContainer" containerID="2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8" Oct 02 11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.629946 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8\": container with ID starting with 2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8 not found: ID does not exist" containerID="2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.629981 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8"} err="failed to get container status \"2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8\": rpc error: code = NotFound desc = could not find container \"2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8\": container with ID starting with 2d70af6fb312266444c5a8702ea28798fed4922146d9149904c0883320cb32a8 not found: ID does not exist" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.630006 4835 scope.go:117] "RemoveContainer" containerID="46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e" Oct 02 
11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.630147 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-log" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.630252 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-log" Oct 02 11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.630320 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-api" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.630378 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-api" Oct 02 11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.630450 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-log" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.630532 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-log" Oct 02 11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.630638 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-metadata" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.630711 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-metadata" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.631050 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-api" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.631141 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-metadata" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.631257 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad9af28-1781-46aa-ba72-995525b67008" containerName="nova-api-log" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.631406 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" containerName="nova-metadata-log" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.632983 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.638722 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.640067 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.662839 4835 scope.go:117] "RemoveContainer" containerID="b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.681956 4835 scope.go:117] "RemoveContainer" containerID="46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e" Oct 02 11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.682851 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e\": container with ID starting with 46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e not found: ID does not exist" containerID="46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.682930 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e"} err="failed to get container status \"46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e\": rpc error: code = NotFound desc = could not find container \"46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e\": container with ID starting with 46a15bde3e68860f124aba0a0b0b077b6d7e6fb258a530bfe0760fb5ff94cb2e not found: ID does not exist" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.682966 4835 scope.go:117] "RemoveContainer" containerID="b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba" Oct 02 11:17:31 crc kubenswrapper[4835]: E1002 11:17:31.683919 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba\": container with ID starting with b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba not found: ID does not exist" containerID="b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.683941 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba"} err="failed to get container status \"b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba\": rpc error: code = NotFound desc = could not find container \"b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba\": container with ID starting with b3cb2ba984bc5622c60431c5633866ca9d7a8644def8164ec8cc7f4e9b7be5ba not found: ID does not exist" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.794468 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-logs\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.794847 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.795150 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdn8v\" (UniqueName: \"kubernetes.io/projected/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-kube-api-access-cdn8v\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.795297 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-config-data\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.890273 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.897689 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-logs\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.897750 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.897820 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdn8v\" (UniqueName: \"kubernetes.io/projected/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-kube-api-access-cdn8v\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.897870 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-config-data\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.898948 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-logs\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.903513 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.904006 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-config-data\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 
11:17:31.907372 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.919022 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdn8v\" (UniqueName: \"kubernetes.io/projected/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-kube-api-access-cdn8v\") pod \"nova-api-0\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " pod="openstack/nova-api-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.926993 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.929054 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.931216 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.931444 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.941734 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:31 crc kubenswrapper[4835]: I1002 11:17:31.972883 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.000156 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-logs\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.000262 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-config-data\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.000332 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.000388 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.000593 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlsnb\" (UniqueName: \"kubernetes.io/projected/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-kube-api-access-mlsnb\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.103053 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-logs\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.103398 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-config-data\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.103445 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.103474 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.103531 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlsnb\" (UniqueName: \"kubernetes.io/projected/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-kube-api-access-mlsnb\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.103661 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-logs\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.109423 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.109901 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.112173 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-config-data\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.125050 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlsnb\" (UniqueName: \"kubernetes.io/projected/ebbc364a-1b3f-40ab-b214-e6b301ae4c1e-kube-api-access-mlsnb\") pod \"nova-metadata-0\" (UID: \"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e\") " pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.249335 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.264056 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ca9d70-c91b-4f22-bc37-f768c874eb4c" path="/var/lib/kubelet/pods/12ca9d70-c91b-4f22-bc37-f768c874eb4c/volumes" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.264897 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3640b038-8550-4290-b01b-43858263b442" path="/var/lib/kubelet/pods/3640b038-8550-4290-b01b-43858263b442/volumes" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.265426 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad9af28-1781-46aa-ba72-995525b67008" path="/var/lib/kubelet/pods/3ad9af28-1781-46aa-ba72-995525b67008/volumes" Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.409199 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.492825 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:32 crc kubenswrapper[4835]: W1002 11:17:32.515616 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebbc364a_1b3f_40ab_b214_e6b301ae4c1e.slice/crio-814aaa4766dbc749487c6bfeb78899e2561a0225d573670a149b35f41609d422 WatchSource:0}: Error finding container 814aaa4766dbc749487c6bfeb78899e2561a0225d573670a149b35f41609d422: Status 404 returned error can't find the container with id 814aaa4766dbc749487c6bfeb78899e2561a0225d573670a149b35f41609d422 Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.575950 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e","Type":"ContainerStarted","Data":"814aaa4766dbc749487c6bfeb78899e2561a0225d573670a149b35f41609d422"} Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.577676 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d7698ba-7a87-4210-9d11-8bb99f997178","Type":"ContainerStarted","Data":"04958215a294557ead1208769acfed9972cdddea7634ec67e7fbb7b5fa652720"} Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.578762 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135","Type":"ContainerStarted","Data":"3f2797560da6ce0659736ae72c3a765f5aabcdcf727aefd12d8803226e10282b"} Oct 02 11:17:32 crc kubenswrapper[4835]: I1002 11:17:32.598578 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.598554991 podStartE2EDuration="2.598554991s" podCreationTimestamp="2025-10-02 11:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:32.594656599 +0000 UTC m=+1329.154564200" watchObservedRunningTime="2025-10-02 11:17:32.598554991 +0000 UTC m=+1329.158462582" Oct 02 11:17:33 crc kubenswrapper[4835]: I1002 11:17:33.594856 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e","Type":"ContainerStarted","Data":"931a57503d2baddd9bdbfac054d4b9be92118b29f11b0b352d5566f2d4ca73ad"} Oct 02 11:17:33 crc kubenswrapper[4835]: I1002 11:17:33.595235 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"ebbc364a-1b3f-40ab-b214-e6b301ae4c1e","Type":"ContainerStarted","Data":"c71c721076c6e38a98193891881519497ee345ec4504c0cb5f7e29836e6c9f75"} Oct 02 11:17:33 crc kubenswrapper[4835]: I1002 11:17:33.597369 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135","Type":"ContainerStarted","Data":"5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe"} Oct 02 11:17:33 crc kubenswrapper[4835]: I1002 11:17:33.597482 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135","Type":"ContainerStarted","Data":"4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699"} Oct 02 11:17:33 crc kubenswrapper[4835]: I1002 11:17:33.617639 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.617620113 podStartE2EDuration="2.617620113s" podCreationTimestamp="2025-10-02 11:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:33.612881796 +0000 UTC m=+1330.172789377" watchObservedRunningTime="2025-10-02 11:17:33.617620113 +0000 UTC m=+1330.177527694" Oct 02 11:17:33 crc kubenswrapper[4835]: I1002 11:17:33.632914 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.632897034 podStartE2EDuration="2.632897034s" podCreationTimestamp="2025-10-02 11:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:33.631588576 +0000 UTC m=+1330.191496157" watchObservedRunningTime="2025-10-02 11:17:33.632897034 +0000 UTC m=+1330.192804615" Oct 02 11:17:35 crc kubenswrapper[4835]: I1002 11:17:35.956555 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:17:37 crc kubenswrapper[4835]: I1002 11:17:37.249685 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:17:37 crc kubenswrapper[4835]: I1002 11:17:37.249755 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:17:40 crc kubenswrapper[4835]: I1002 11:17:40.956507 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:17:40 crc kubenswrapper[4835]: I1002 11:17:40.987596 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:17:41 crc kubenswrapper[4835]: I1002 11:17:41.710857 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:17:41 crc kubenswrapper[4835]: I1002 11:17:41.973393 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:41 crc kubenswrapper[4835]: I1002 11:17:41.973460 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:42 crc kubenswrapper[4835]: I1002 11:17:42.249577 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:17:42 crc kubenswrapper[4835]: I1002 11:17:42.249629 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Oct 02 11:17:43 crc kubenswrapper[4835]: I1002 11:17:43.014473 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:43 crc kubenswrapper[4835]: I1002 11:17:43.056507 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:43 crc kubenswrapper[4835]: I1002 11:17:43.262435 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ebbc364a-1b3f-40ab-b214-e6b301ae4c1e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:43 crc kubenswrapper[4835]: I1002 11:17:43.262436 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ebbc364a-1b3f-40ab-b214-e6b301ae4c1e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:51 crc kubenswrapper[4835]: I1002 11:17:51.977234 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:17:51 crc kubenswrapper[4835]: I1002 11:17:51.978266 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:17:51 crc kubenswrapper[4835]: I1002 11:17:51.978551 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:17:51 crc kubenswrapper[4835]: I1002 11:17:51.982379 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.275610 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.275723 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.281276 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.284757 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.774586 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.779476 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.958627 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mp7rf"] Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.960648 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:52 crc kubenswrapper[4835]: I1002 11:17:52.974504 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mp7rf"] Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.020113 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8k6h\" (UniqueName: \"kubernetes.io/projected/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-kube-api-access-r8k6h\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.020180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-config\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.020205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-dns-svc\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.020289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.020340 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.122175 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.122320 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8k6h\" (UniqueName: \"kubernetes.io/projected/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-kube-api-access-r8k6h\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.122366 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-config\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.122402 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-dns-svc\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.122489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.123324 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-config\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.123364 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.123600 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-dns-svc\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.123676 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.148008 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8k6h\" (UniqueName: \"kubernetes.io/projected/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-kube-api-access-r8k6h\") pod \"dnsmasq-dns-5b856c5697-mp7rf\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.290579 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:53 crc kubenswrapper[4835]: I1002 11:17:53.780257 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mp7rf"] Oct 02 11:17:54 crc kubenswrapper[4835]: I1002 11:17:54.797049 4835 generic.go:334] "Generic (PLEG): container finished" podID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerID="08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685" exitCode=0 Oct 02 11:17:54 crc kubenswrapper[4835]: I1002 11:17:54.797107 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" event={"ID":"87ac457e-e0c6-4108-b1f0-4eaae559d4a5","Type":"ContainerDied","Data":"08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685"} Oct 02 11:17:54 crc kubenswrapper[4835]: I1002 11:17:54.797348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" event={"ID":"87ac457e-e0c6-4108-b1f0-4eaae559d4a5","Type":"ContainerStarted","Data":"90415390f72e5a76481afe4210904baec79efc9e4faf7d68d9d3b21132688495"} Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.242357 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.242728 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-central-agent" containerID="cri-o://15a28b4dada598d6695e48ab3b27a006a69d2c145d5661023700694f531efe80" gracePeriod=30 Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.242781 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="sg-core" containerID="cri-o://b01a12e81a4c19d31d6b531b95cb02b753130af1ae872d5283d8ed399866c9d7" gracePeriod=30 Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.242843 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-notification-agent" containerID="cri-o://8ac6d400448de7d27c3d10a316dcb417bb140ccf5ad576d32452cdd647402c21" gracePeriod=30 Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.242843 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="proxy-httpd" containerID="cri-o://601466600cef5e7769a88427f9f413418f305505ce12141a1bce2b4155289332" gracePeriod=30 Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.824601 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" event={"ID":"87ac457e-e0c6-4108-b1f0-4eaae559d4a5","Type":"ContainerStarted","Data":"07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570"} Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.824731 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.830039 4835 generic.go:334] "Generic (PLEG): container finished" podID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerID="601466600cef5e7769a88427f9f413418f305505ce12141a1bce2b4155289332" exitCode=0 Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.830325 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerID="b01a12e81a4c19d31d6b531b95cb02b753130af1ae872d5283d8ed399866c9d7" exitCode=2 Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.830338 4835 generic.go:334] "Generic (PLEG): container finished" podID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerID="15a28b4dada598d6695e48ab3b27a006a69d2c145d5661023700694f531efe80" exitCode=0 Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.830124 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerDied","Data":"601466600cef5e7769a88427f9f413418f305505ce12141a1bce2b4155289332"} Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.830373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerDied","Data":"b01a12e81a4c19d31d6b531b95cb02b753130af1ae872d5283d8ed399866c9d7"} Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.830387 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerDied","Data":"15a28b4dada598d6695e48ab3b27a006a69d2c145d5661023700694f531efe80"} Oct 02 11:17:55 crc kubenswrapper[4835]: I1002 11:17:55.846872 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" podStartSLOduration=3.846849018 podStartE2EDuration="3.846849018s" podCreationTimestamp="2025-10-02 11:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:55.843142061 +0000 UTC m=+1352.403049662" watchObservedRunningTime="2025-10-02 11:17:55.846849018 +0000 UTC m=+1352.406756599" Oct 02 11:17:56 crc kubenswrapper[4835]: I1002 11:17:56.197876 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:56 crc kubenswrapper[4835]: I1002 11:17:56.198396 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-api" containerID="cri-o://5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe" gracePeriod=30 Oct 02 11:17:56 crc kubenswrapper[4835]: I1002 11:17:56.198356 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-log" containerID="cri-o://4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699" gracePeriod=30 Oct 02 11:17:56 crc kubenswrapper[4835]: I1002 11:17:56.844091 4835 generic.go:334] "Generic (PLEG): container finished" podID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerID="4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699" exitCode=143 Oct 02 11:17:56 crc kubenswrapper[4835]: I1002 11:17:56.844318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135","Type":"ContainerDied","Data":"4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699"} Oct 02 11:17:58 crc kubenswrapper[4835]: I1002 11:17:58.864961 4835 generic.go:334] "Generic (PLEG): container finished" podID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerID="8ac6d400448de7d27c3d10a316dcb417bb140ccf5ad576d32452cdd647402c21" exitCode=0 Oct 02 11:17:58 crc kubenswrapper[4835]: I1002 11:17:58.865044 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerDied","Data":"8ac6d400448de7d27c3d10a316dcb417bb140ccf5ad576d32452cdd647402c21"} Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.081883 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180411 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-log-httpd\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180480 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-combined-ca-bundle\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180537 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-ceilometer-tls-certs\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180576 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-run-httpd\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180616 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-config-data\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180647 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-scripts\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180683 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf9n7\" (UniqueName: \"kubernetes.io/projected/c55deccf-4e69-437c-96a0-ff5f8200acad-kube-api-access-zf9n7\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.180719 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-sg-core-conf-yaml\") pod \"c55deccf-4e69-437c-96a0-ff5f8200acad\" (UID: \"c55deccf-4e69-437c-96a0-ff5f8200acad\") " Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.181028 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: 
"c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.181621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: "c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.181968 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.181989 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c55deccf-4e69-437c-96a0-ff5f8200acad-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.188947 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55deccf-4e69-437c-96a0-ff5f8200acad-kube-api-access-zf9n7" (OuterVolumeSpecName: "kube-api-access-zf9n7") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: "c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "kube-api-access-zf9n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.191887 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-scripts" (OuterVolumeSpecName: "scripts") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: "c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.244605 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: "c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.254280 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: "c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.283933 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.283962 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.283973 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf9n7\" (UniqueName: \"kubernetes.io/projected/c55deccf-4e69-437c-96a0-ff5f8200acad-kube-api-access-zf9n7\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.283983 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.299523 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: "c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.320800 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-config-data" (OuterVolumeSpecName: "config-data") pod "c55deccf-4e69-437c-96a0-ff5f8200acad" (UID: "c55deccf-4e69-437c-96a0-ff5f8200acad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.385901 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.386194 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55deccf-4e69-437c-96a0-ff5f8200acad-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.841926 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.903824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c55deccf-4e69-437c-96a0-ff5f8200acad","Type":"ContainerDied","Data":"1ca52b1872c2b55bbbbfcf731c92d3abdcc2ad21260da534db7ec416f7535975"} Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.903874 4835 scope.go:117] "RemoveContainer" containerID="601466600cef5e7769a88427f9f413418f305505ce12141a1bce2b4155289332" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.904024 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.918463 4835 generic.go:334] "Generic (PLEG): container finished" podID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerID="5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe" exitCode=0 Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.918529 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.918554 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135","Type":"ContainerDied","Data":"5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe"} Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.918927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135","Type":"ContainerDied","Data":"3f2797560da6ce0659736ae72c3a765f5aabcdcf727aefd12d8803226e10282b"} Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.959586 4835 scope.go:117] "RemoveContainer" containerID="b01a12e81a4c19d31d6b531b95cb02b753130af1ae872d5283d8ed399866c9d7" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.959773 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.972940 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.990171 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:59 crc kubenswrapper[4835]: E1002 11:17:59.990821 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-api" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.990847 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-api" Oct 02 11:17:59 crc kubenswrapper[4835]: E1002 11:17:59.990883 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-notification-agent" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.990890 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-notification-agent" Oct 02 11:17:59 crc kubenswrapper[4835]: E1002 11:17:59.990904 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-central-agent" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.990914 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-central-agent" Oct 02 11:17:59 crc kubenswrapper[4835]: E1002 11:17:59.990931 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-log" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.990939 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-log" Oct 02 11:17:59 crc kubenswrapper[4835]: E1002 11:17:59.990953 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="sg-core" Oct 02 11:17:59 crc kubenswrapper[4835]: 
I1002 11:17:59.990960 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="sg-core" Oct 02 11:17:59 crc kubenswrapper[4835]: E1002 11:17:59.990983 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="proxy-httpd" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.990990 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="proxy-httpd" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.991214 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-central-agent" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.991248 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="sg-core" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.991257 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-api" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.991274 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" containerName="nova-api-log" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.991290 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="proxy-httpd" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.991306 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" containerName="ceilometer-notification-agent" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.993277 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.995912 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.996326 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:17:59 crc kubenswrapper[4835]: I1002 11:17:59.996855 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.000644 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.001159 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-logs\") pod \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.001244 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdn8v\" (UniqueName: \"kubernetes.io/projected/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-kube-api-access-cdn8v\") pod \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.001295 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-config-data\") pod \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.001419 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-combined-ca-bundle\") pod \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\" (UID: \"bc12c6e8-6ab3-4edc-999c-eaf70ad8d135\") " Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.002031 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-logs" (OuterVolumeSpecName: "logs") pod "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" (UID: "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.008847 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-kube-api-access-cdn8v" (OuterVolumeSpecName: "kube-api-access-cdn8v") pod "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" (UID: "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135"). InnerVolumeSpecName "kube-api-access-cdn8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.049526 4835 scope.go:117] "RemoveContainer" containerID="8ac6d400448de7d27c3d10a316dcb417bb140ccf5ad576d32452cdd647402c21" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.056999 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" (UID: "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.057805 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-config-data" (OuterVolumeSpecName: "config-data") pod "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" (UID: "bc12c6e8-6ab3-4edc-999c-eaf70ad8d135"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104072 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-config-data\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104158 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5qf\" (UniqueName: \"kubernetes.io/projected/e9bd12da-7335-4fac-84d4-36e3f674a435-kube-api-access-kv5qf\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104285 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104309 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104354 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-log-httpd\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104408 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104447 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-scripts\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104463 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-run-httpd\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc 
kubenswrapper[4835]: I1002 11:18:00.104513 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdn8v\" (UniqueName: \"kubernetes.io/projected/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-kube-api-access-cdn8v\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104523 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104531 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.104539 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.149481 4835 scope.go:117] "RemoveContainer" containerID="15a28b4dada598d6695e48ab3b27a006a69d2c145d5661023700694f531efe80" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.183602 4835 scope.go:117] "RemoveContainer" containerID="5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205599 4835 scope.go:117] "RemoveContainer" containerID="4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205687 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-scripts\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205729 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-run-httpd\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205762 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-config-data\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205810 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5qf\" (UniqueName: \"kubernetes.io/projected/e9bd12da-7335-4fac-84d4-36e3f674a435-kube-api-access-kv5qf\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205892 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205924 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.205947 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-log-httpd\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.206020 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.206847 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-run-httpd\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.206988 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-log-httpd\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.209963 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.210917 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-config-data\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.211627 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-scripts\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.216437 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.216549 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.225465 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5qf\" (UniqueName: 
\"kubernetes.io/projected/e9bd12da-7335-4fac-84d4-36e3f674a435-kube-api-access-kv5qf\") pod \"ceilometer-0\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.237404 4835 scope.go:117] "RemoveContainer" containerID="5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe" Oct 02 11:18:00 crc kubenswrapper[4835]: E1002 11:18:00.237735 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe\": container with ID starting with 5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe not found: ID does not exist" containerID="5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.237842 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe"} err="failed to get container status \"5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe\": rpc error: code = NotFound desc = could not find container \"5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe\": container with ID starting with 5dc1b9f716e1b6db83aff52f7ffc59f6a6ed1023532e4a7ebfd4741daa6fffbe not found: ID does not exist" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.237932 4835 scope.go:117] "RemoveContainer" containerID="4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699" Oct 02 11:18:00 crc kubenswrapper[4835]: E1002 11:18:00.238240 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699\": container with ID starting with 4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699 not found: ID does not exist" containerID="4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.238265 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699"} err="failed to get container status \"4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699\": rpc error: code = NotFound desc = could not find container \"4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699\": container with ID starting with 4eb394fc8f57b057cd4c6eff75bbf8dbb99b641c1d42ecaa3c94fc6775b8e699 not found: ID does not exist" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.267933 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55deccf-4e69-437c-96a0-ff5f8200acad" path="/var/lib/kubelet/pods/c55deccf-4e69-437c-96a0-ff5f8200acad/volumes" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.270146 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.290824 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.324648 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.328137 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.330887 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.332606 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.332718 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.359368 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.423351 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6r9x\" (UniqueName: \"kubernetes.io/projected/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-kube-api-access-d6r9x\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.423713 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.424729 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.424831 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.425378 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-config-data\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.425418 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-logs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.440982 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.527000 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.527075 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.527184 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-config-data\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.527210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-logs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.527297 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6r9x\" (UniqueName: \"kubernetes.io/projected/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-kube-api-access-d6r9x\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.527867 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-logs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.528047 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.531923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.531924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.536297 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 
crc kubenswrapper[4835]: I1002 11:18:00.541253 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-config-data\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.552758 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6r9x\" (UniqueName: \"kubernetes.io/projected/2efae1cc-e8ac-43fd-bb26-6e8897e916f8-kube-api-access-d6r9x\") pod \"nova-api-0\" (UID: \"2efae1cc-e8ac-43fd-bb26-6e8897e916f8\") " pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.672054 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:18:00 crc kubenswrapper[4835]: I1002 11:18:00.965110 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:18:01 crc kubenswrapper[4835]: I1002 11:18:01.000444 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:18:01 crc kubenswrapper[4835]: I1002 11:18:01.235409 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:18:01 crc kubenswrapper[4835]: I1002 11:18:01.946940 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2efae1cc-e8ac-43fd-bb26-6e8897e916f8","Type":"ContainerStarted","Data":"382f43b6449c6072fd35983858c4840bc1f917aff5c32301137cb95157c514f4"} Oct 02 11:18:01 crc kubenswrapper[4835]: I1002 11:18:01.948435 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2efae1cc-e8ac-43fd-bb26-6e8897e916f8","Type":"ContainerStarted","Data":"60e016b3a6b3a887265d243c2c2943d1ea8f1d85cb9e830db2eb87af9c997bd3"} Oct 02 11:18:01 crc kubenswrapper[4835]: I1002 11:18:01.949393 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerStarted","Data":"b3c6629dde6c9f7c2f8657e6a652fe3a96a91efa411f74760093e9d8106b584f"} Oct 02 11:18:02 crc kubenswrapper[4835]: I1002 11:18:02.263351 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc12c6e8-6ab3-4edc-999c-eaf70ad8d135" path="/var/lib/kubelet/pods/bc12c6e8-6ab3-4edc-999c-eaf70ad8d135/volumes" Oct 02 11:18:02 crc kubenswrapper[4835]: I1002 11:18:02.958797 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerStarted","Data":"c5f41bf19843de65c7058a557f1bb07691dfa9cbab75c48bfb2e354fa6c428d4"} Oct 02 11:18:02 crc kubenswrapper[4835]: I1002 11:18:02.959236 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerStarted","Data":"0c5061c706e1924d68bd435577b3a879924ba6a43d453f8054d879c4d2769b1c"} Oct 02 11:18:02 crc kubenswrapper[4835]: I1002 11:18:02.964456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2efae1cc-e8ac-43fd-bb26-6e8897e916f8","Type":"ContainerStarted","Data":"956c2ab5784ec21057e27296e3f2946e323952dda6a21b456f83fa6516a7b464"} Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.292623 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:18:03 crc 
kubenswrapper[4835]: I1002 11:18:03.333419 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.333391183 podStartE2EDuration="3.333391183s" podCreationTimestamp="2025-10-02 11:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:18:02.991848459 +0000 UTC m=+1359.551756040" watchObservedRunningTime="2025-10-02 11:18:03.333391183 +0000 UTC m=+1359.893298774" Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.360054 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-78gl4"] Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.360541 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" podUID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerName="dnsmasq-dns" containerID="cri-o://48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42" gracePeriod=10 Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.903730 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.976263 4835 generic.go:334] "Generic (PLEG): container finished" podID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerID="48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42" exitCode=0 Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.976360 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.976389 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" event={"ID":"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d","Type":"ContainerDied","Data":"48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42"} Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.976863 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-78gl4" event={"ID":"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d","Type":"ContainerDied","Data":"1581cd2366cf30f69c6d0205918d39498e243376a66fc79aefe42610ff40dd09"} Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.976891 4835 scope.go:117] "RemoveContainer" containerID="48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42" Oct 02 11:18:03 crc kubenswrapper[4835]: I1002 11:18:03.986280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerStarted","Data":"1fabad7a7dc9d9d6c217458ff6c6d9d451e0d3bf0b14f2b952653e46833449bd"} Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.006620 4835 scope.go:117] "RemoveContainer" containerID="8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.009034 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-nb\") pod \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.009138 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-dns-svc\") pod \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.009320 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-config\") pod \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.009450 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk4t2\" (UniqueName: \"kubernetes.io/projected/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-kube-api-access-xk4t2\") pod \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.009550 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-sb\") pod \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\" (UID: \"18af06c7-589a-4ddb-aa4f-92ddfb5ed95d\") " Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.019924 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-kube-api-access-xk4t2" (OuterVolumeSpecName: "kube-api-access-xk4t2") pod "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" (UID: "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d"). InnerVolumeSpecName "kube-api-access-xk4t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.037992 4835 scope.go:117] "RemoveContainer" containerID="48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42" Oct 02 11:18:04 crc kubenswrapper[4835]: E1002 11:18:04.038486 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42\": container with ID starting with 48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42 not found: ID does not exist" containerID="48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.038555 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42"} err="failed to get container status \"48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42\": rpc error: code = NotFound desc = could not find container \"48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42\": container with ID starting with 48da7a0d34c25d75aa8d0f9ae264bcd829853d22f7a532a2b0942b9d7d467e42 not found: ID does not exist" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.038587 4835 scope.go:117] "RemoveContainer" containerID="8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a" Oct 02 11:18:04 crc kubenswrapper[4835]: E1002 11:18:04.039249 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a\": container with ID starting with 8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a not found: ID does not exist" 
containerID="8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.039328 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a"} err="failed to get container status \"8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a\": rpc error: code = NotFound desc = could not find container \"8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a\": container with ID starting with 8ba0962de50ab694554bc560dfc9dc6b51092772fd59c094303c0391098fc78a not found: ID does not exist" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.066055 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" (UID: "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.072470 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-config" (OuterVolumeSpecName: "config") pod "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" (UID: "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.072967 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" (UID: "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.082784 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" (UID: "18af06c7-589a-4ddb-aa4f-92ddfb5ed95d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.113182 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk4t2\" (UniqueName: \"kubernetes.io/projected/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-kube-api-access-xk4t2\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.113215 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.113240 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.113251 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.113264 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.329504 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-78gl4"] Oct 02 11:18:04 crc kubenswrapper[4835]: I1002 11:18:04.338764 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-78gl4"] Oct 02 11:18:06 crc kubenswrapper[4835]: I1002 11:18:06.017315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerStarted","Data":"75822ef3154de0aba12eb764accc619ff70ad5870583ee58753a92f4e7648898"} Oct 02 11:18:06 crc kubenswrapper[4835]: I1002 11:18:06.017601 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:18:06 crc kubenswrapper[4835]: I1002 11:18:06.041075 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.558104318 podStartE2EDuration="7.041054634s" podCreationTimestamp="2025-10-02 11:17:59 +0000 UTC" firstStartedPulling="2025-10-02 11:18:01.00025422 +0000 UTC m=+1357.560161811" lastFinishedPulling="2025-10-02 11:18:05.483204536 +0000 UTC m=+1362.043112127" observedRunningTime="2025-10-02 11:18:06.034427712 +0000 UTC m=+1362.594335313" watchObservedRunningTime="2025-10-02 11:18:06.041054634 +0000 UTC m=+1362.600962215" Oct 02 11:18:06 crc kubenswrapper[4835]: I1002 11:18:06.269897 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" path="/var/lib/kubelet/pods/18af06c7-589a-4ddb-aa4f-92ddfb5ed95d/volumes" Oct 02 11:18:10 crc kubenswrapper[4835]: I1002 11:18:10.672316 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:18:10 crc kubenswrapper[4835]: I1002 11:18:10.672809 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:18:11 crc kubenswrapper[4835]: I1002 11:18:11.686500 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2efae1cc-e8ac-43fd-bb26-6e8897e916f8" 
containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:18:11 crc kubenswrapper[4835]: I1002 11:18:11.686500 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2efae1cc-e8ac-43fd-bb26-6e8897e916f8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:18:20 crc kubenswrapper[4835]: I1002 11:18:20.679970 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:18:20 crc kubenswrapper[4835]: I1002 11:18:20.680604 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:18:20 crc kubenswrapper[4835]: I1002 11:18:20.680939 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:18:20 crc kubenswrapper[4835]: I1002 11:18:20.680988 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:18:20 crc kubenswrapper[4835]: I1002 11:18:20.687440 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:18:20 crc kubenswrapper[4835]: I1002 11:18:20.689556 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:18:30 crc kubenswrapper[4835]: I1002 11:18:30.452912 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:18:41 crc kubenswrapper[4835]: I1002 11:18:41.883602 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:43 crc kubenswrapper[4835]: I1002 11:18:43.100623 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:46 crc kubenswrapper[4835]: I1002 11:18:46.042201 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerName="rabbitmq" containerID="cri-o://c7d6fcf1360156f5f0d76fb0f5cdea23c360623fbf19f383a3934656ea10dc9b" gracePeriod=604796 Oct 02 11:18:47 crc kubenswrapper[4835]: I1002 11:18:47.202594 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="rabbitmq" containerID="cri-o://a40271397664aed96ba97530aa140906e693827bd323db569c29450f0806063b" gracePeriod=604796 Oct 02 11:18:50 crc kubenswrapper[4835]: I1002 11:18:50.757469 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Oct 02 11:18:51 crc kubenswrapper[4835]: I1002 11:18:51.050798 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.446713 4835 generic.go:334] "Generic (PLEG): container finished" podID="75581788-2dfd-41d9-8500-0b4e3d050cab" 
containerID="c7d6fcf1360156f5f0d76fb0f5cdea23c360623fbf19f383a3934656ea10dc9b" exitCode=0 Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.446789 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75581788-2dfd-41d9-8500-0b4e3d050cab","Type":"ContainerDied","Data":"c7d6fcf1360156f5f0d76fb0f5cdea23c360623fbf19f383a3934656ea10dc9b"} Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.582954 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658495 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-server-conf\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658581 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658606 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-confd\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658630 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-plugins\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658657 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-plugins-conf\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658692 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75581788-2dfd-41d9-8500-0b4e3d050cab-erlang-cookie-secret\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658736 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75581788-2dfd-41d9-8500-0b4e3d050cab-pod-info\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658776 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-tls\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658821 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-erlang-cookie\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658872 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxkn4\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-kube-api-access-qxkn4\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.658960 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-config-data\") pod \"75581788-2dfd-41d9-8500-0b4e3d050cab\" (UID: \"75581788-2dfd-41d9-8500-0b4e3d050cab\") " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.659001 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.659518 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.659629 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.666200 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75581788-2dfd-41d9-8500-0b4e3d050cab-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.666716 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-kube-api-access-qxkn4" (OuterVolumeSpecName: "kube-api-access-qxkn4") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "kube-api-access-qxkn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.668038 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/75581788-2dfd-41d9-8500-0b4e3d050cab-pod-info" (OuterVolumeSpecName: "pod-info") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.669457 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.671678 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.761898 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxkn4\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-kube-api-access-qxkn4\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.762317 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.762411 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75581788-2dfd-41d9-8500-0b4e3d050cab-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.762484 4835 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75581788-2dfd-41d9-8500-0b4e3d050cab-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.762559 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.762627 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.797282 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.797948 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-config-data" (OuterVolumeSpecName: "config-data") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.828107 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.835628 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.836995 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-server-conf" (OuterVolumeSpecName: "server-conf") pod "75581788-2dfd-41d9-8500-0b4e3d050cab" (UID: "75581788-2dfd-41d9-8500-0b4e3d050cab"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.865254 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.865290 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.865298 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75581788-2dfd-41d9-8500-0b4e3d050cab-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.865309 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:52 crc kubenswrapper[4835]: I1002 11:18:52.865317 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75581788-2dfd-41d9-8500-0b4e3d050cab-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.462028 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75581788-2dfd-41d9-8500-0b4e3d050cab","Type":"ContainerDied","Data":"d142e42e7bed7a3468134867521820f8354d04599870b2c2298e6578d13356e7"} Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.462323 4835 scope.go:117] "RemoveContainer" containerID="c7d6fcf1360156f5f0d76fb0f5cdea23c360623fbf19f383a3934656ea10dc9b" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.462044 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.468678 4835 generic.go:334] "Generic (PLEG): container finished" podID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerID="a40271397664aed96ba97530aa140906e693827bd323db569c29450f0806063b" exitCode=0 Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.468716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6bc536a4-ef50-4d6d-aca5-a030ce38ce24","Type":"ContainerDied","Data":"a40271397664aed96ba97530aa140906e693827bd323db569c29450f0806063b"} Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.484600 4835 scope.go:117] "RemoveContainer" containerID="37458b76b2ffa63de5c4a5fa2995e72d004e9c5d9c332ebf588cdfc9661d408e" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.508374 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.515361 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.546302 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:53 crc kubenswrapper[4835]: E1002 11:18:53.546974 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerName="dnsmasq-dns" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.546998 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerName="dnsmasq-dns" Oct 02 11:18:53 crc kubenswrapper[4835]: E1002 11:18:53.547030 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerName="init" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.547039 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerName="init" Oct 02 11:18:53 crc kubenswrapper[4835]: E1002 11:18:53.547057 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerName="rabbitmq" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.547065 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerName="rabbitmq" Oct 02 11:18:53 crc kubenswrapper[4835]: E1002 11:18:53.547085 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerName="setup-container" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.547092 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerName="setup-container" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.548328 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" containerName="rabbitmq" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.548373 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="18af06c7-589a-4ddb-aa4f-92ddfb5ed95d" containerName="dnsmasq-dns" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.549579 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.553455 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.557044 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gnqgj" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.557326 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.557633 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.557946 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.558162 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.564005 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.565804 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.685891 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.685945 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.685983 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686008 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686521 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686636 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-server-conf\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686763 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686796 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-config-data\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686816 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743bfb49-5459-4911-8eee-4bb313368c21-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686844 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qfg\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-kube-api-access-g9qfg\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.686992 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743bfb49-5459-4911-8eee-4bb313368c21-pod-info\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789330 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743bfb49-5459-4911-8eee-4bb313368c21-pod-info\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789413 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789443 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789476 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " 
pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789497 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789557 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-server-conf\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789638 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789657 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-config-data\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789673 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743bfb49-5459-4911-8eee-4bb313368c21-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.789695 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9qfg\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-kube-api-access-g9qfg\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.796043 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.796084 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-config-data\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.796319 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.796918 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.796965 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743bfb49-5459-4911-8eee-4bb313368c21-pod-info\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.797573 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.797803 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/743bfb49-5459-4911-8eee-4bb313368c21-server-conf\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.798262 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.800192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743bfb49-5459-4911-8eee-4bb313368c21-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.802233 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.813260 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9qfg\" (UniqueName: \"kubernetes.io/projected/743bfb49-5459-4911-8eee-4bb313368c21-kube-api-access-g9qfg\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.836787 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"743bfb49-5459-4911-8eee-4bb313368c21\") " pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.871991 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.872971 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-plugins\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993236 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-erlang-cookie\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993284 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-config-data\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993298 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-plugins-conf\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdtl6\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-kube-api-access-pdtl6\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993373 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-server-conf\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993389 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993413 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-tls\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993595 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-pod-info\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993623 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-confd\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993619 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.993786 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-erlang-cookie-secret\") pod \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\" (UID: \"6bc536a4-ef50-4d6d-aca5-a030ce38ce24\") " Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.994259 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:53 crc kubenswrapper[4835]: I1002 11:18:53.997854 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:53.999344 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.000457 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.001422 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.001637 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-kube-api-access-pdtl6" (OuterVolumeSpecName: "kube-api-access-pdtl6") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "kube-api-access-pdtl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.002030 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.008929 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-pod-info" (OuterVolumeSpecName: "pod-info") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.029157 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-config-data" (OuterVolumeSpecName: "config-data") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.059177 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-server-conf" (OuterVolumeSpecName: "server-conf") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096640 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdtl6\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-kube-api-access-pdtl6\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096678 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096715 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096729 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096740 4835 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096748 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096757 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096768 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.096776 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.104785 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6bc536a4-ef50-4d6d-aca5-a030ce38ce24" (UID: "6bc536a4-ef50-4d6d-aca5-a030ce38ce24"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.123327 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.198919 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.199258 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bc536a4-ef50-4d6d-aca5-a030ce38ce24-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.272934 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75581788-2dfd-41d9-8500-0b4e3d050cab" path="/var/lib/kubelet/pods/75581788-2dfd-41d9-8500-0b4e3d050cab/volumes" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.357624 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.481142 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"743bfb49-5459-4911-8eee-4bb313368c21","Type":"ContainerStarted","Data":"4c21d5f9e04c038abd266d5987fa15ca44dd0da48c5cbe2d0fceae834828acd7"} Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.483843 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6bc536a4-ef50-4d6d-aca5-a030ce38ce24","Type":"ContainerDied","Data":"e48e57d1da6c9e9c0484b04a239321bc2b9cfefdf19ce428af30c5d39dfd9d41"} Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.483885 4835 scope.go:117] "RemoveContainer" containerID="a40271397664aed96ba97530aa140906e693827bd323db569c29450f0806063b" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.483964 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.525924 4835 scope.go:117] "RemoveContainer" containerID="0ef1712aa04a0792cf4b9650430c005225c98aa32da76b73bd349ff12c229cbd" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.529720 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.541293 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.556212 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:54 crc kubenswrapper[4835]: E1002 11:18:54.556959 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="rabbitmq" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.557031 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="rabbitmq" Oct 02 11:18:54 crc kubenswrapper[4835]: E1002 11:18:54.557105 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="setup-container" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.557171 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="setup-container" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.557427 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" containerName="rabbitmq" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.558891 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.562137 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cm8f9" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.562412 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.562644 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.562777 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.562857 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.563095 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.563259 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.567411 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.707427 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.707744 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.707898 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708045 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708316 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708411 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708567 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7vlh\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-kube-api-access-f7vlh\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708728 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.708769 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810081 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810425 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810456 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7vlh\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-kube-api-access-f7vlh\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810510 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810529 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810558 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810579 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810613 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810668 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.810686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.811119 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.811408 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.811532 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.811927 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.811968 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.813226 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.815559 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.816444 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.817536 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.818261 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.829401 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7vlh\" (UniqueName: \"kubernetes.io/projected/74ecc09a-8044-49d6-8c9b-2cbcc56d9612-kube-api-access-f7vlh\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.845842 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"74ecc09a-8044-49d6-8c9b-2cbcc56d9612\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:54 crc kubenswrapper[4835]: I1002 11:18:54.908746 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:55 crc kubenswrapper[4835]: I1002 11:18:55.408949 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:55 crc kubenswrapper[4835]: W1002 11:18:55.415798 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ecc09a_8044_49d6_8c9b_2cbcc56d9612.slice/crio-f5c7b453a0fdcc27f18386a421ba3769d8eb6dad74b791ee27a03f4bcda9311a WatchSource:0}: Error finding container f5c7b453a0fdcc27f18386a421ba3769d8eb6dad74b791ee27a03f4bcda9311a: Status 404 returned error can't find the container with id f5c7b453a0fdcc27f18386a421ba3769d8eb6dad74b791ee27a03f4bcda9311a Oct 02 11:18:55 crc kubenswrapper[4835]: I1002 11:18:55.494491 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"74ecc09a-8044-49d6-8c9b-2cbcc56d9612","Type":"ContainerStarted","Data":"f5c7b453a0fdcc27f18386a421ba3769d8eb6dad74b791ee27a03f4bcda9311a"} Oct 02 11:18:56 crc kubenswrapper[4835]: I1002 11:18:56.266790 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc536a4-ef50-4d6d-aca5-a030ce38ce24" path="/var/lib/kubelet/pods/6bc536a4-ef50-4d6d-aca5-a030ce38ce24/volumes" Oct 02 11:18:56 crc kubenswrapper[4835]: I1002 11:18:56.509949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"743bfb49-5459-4911-8eee-4bb313368c21","Type":"ContainerStarted","Data":"7de499bd43b507a25bdaa838ae06485a956df0d98ac26d9c1471649bd968b1ac"} Oct 02 11:18:57 crc kubenswrapper[4835]: I1002 11:18:57.521753 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"74ecc09a-8044-49d6-8c9b-2cbcc56d9612","Type":"ContainerStarted","Data":"6ee8fec3c6bd26691c3b2a3686e11dae982241a76e21367bbe21ad4d7300033e"} Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.369233 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2g8xw"] Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.371106 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.373955 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.384720 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.384785 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.384841 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.384871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.384967 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfstn\" (UniqueName: \"kubernetes.io/projected/fc024136-4d73-471e-b1c6-86229c8264fa-kube-api-access-hfstn\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.385114 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-config\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.390727 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2g8xw"] Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.486722 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-config\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.486861 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: 
\"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.486889 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.486917 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.486935 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.486973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfstn\" (UniqueName: \"kubernetes.io/projected/fc024136-4d73-471e-b1c6-86229c8264fa-kube-api-access-hfstn\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.487735 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-config\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.488102 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.489994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.490083 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.490141 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 
11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.507591 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfstn\" (UniqueName: \"kubernetes.io/projected/fc024136-4d73-471e-b1c6-86229c8264fa-kube-api-access-hfstn\") pod \"dnsmasq-dns-6447ccbd8f-2g8xw\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:58 crc kubenswrapper[4835]: I1002 11:18:58.690282 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:18:59 crc kubenswrapper[4835]: I1002 11:18:59.179738 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2g8xw"] Oct 02 11:18:59 crc kubenswrapper[4835]: W1002 11:18:59.181848 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc024136_4d73_471e_b1c6_86229c8264fa.slice/crio-d20f042ba6fa7b23b81ac758de2b8c2799b0ef34a985a9a8e478d5a6f6290ff4 WatchSource:0}: Error finding container d20f042ba6fa7b23b81ac758de2b8c2799b0ef34a985a9a8e478d5a6f6290ff4: Status 404 returned error can't find the container with id d20f042ba6fa7b23b81ac758de2b8c2799b0ef34a985a9a8e478d5a6f6290ff4 Oct 02 11:18:59 crc kubenswrapper[4835]: I1002 11:18:59.542967 4835 generic.go:334] "Generic (PLEG): container finished" podID="fc024136-4d73-471e-b1c6-86229c8264fa" containerID="fdee90d3ebdc9eb5b237720a313faf11147d95931b3fc5f641316a2ea9cb676b" exitCode=0 Oct 02 11:18:59 crc kubenswrapper[4835]: I1002 11:18:59.543043 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" event={"ID":"fc024136-4d73-471e-b1c6-86229c8264fa","Type":"ContainerDied","Data":"fdee90d3ebdc9eb5b237720a313faf11147d95931b3fc5f641316a2ea9cb676b"} Oct 02 11:18:59 crc kubenswrapper[4835]: I1002 11:18:59.543425 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" event={"ID":"fc024136-4d73-471e-b1c6-86229c8264fa","Type":"ContainerStarted","Data":"d20f042ba6fa7b23b81ac758de2b8c2799b0ef34a985a9a8e478d5a6f6290ff4"} Oct 02 11:19:00 crc kubenswrapper[4835]: I1002 11:19:00.557580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" event={"ID":"fc024136-4d73-471e-b1c6-86229c8264fa","Type":"ContainerStarted","Data":"484c49d26816d65fc222f1890a428d13caa81de3af992b8148ce1646a1daa389"} Oct 02 11:19:00 crc kubenswrapper[4835]: I1002 11:19:00.558578 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:19:00 crc kubenswrapper[4835]: I1002 11:19:00.586671 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" podStartSLOduration=2.586637287 podStartE2EDuration="2.586637287s" podCreationTimestamp="2025-10-02 11:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:19:00.581359904 +0000 UTC m=+1417.141267655" watchObservedRunningTime="2025-10-02 11:19:00.586637287 +0000 UTC m=+1417.146544888" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.691580 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.753506 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5b856c5697-mp7rf"] Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.753792 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" podUID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerName="dnsmasq-dns" containerID="cri-o://07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570" gracePeriod=10 Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.957195 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-dsbm5"] Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.959527 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.968770 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-dsbm5"] Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.997060 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtgh\" (UniqueName: \"kubernetes.io/projected/b8894eb8-5eee-48da-81e5-1a98616c3a1f-kube-api-access-9qtgh\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.997586 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.997614 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.997644 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.997698 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:08 crc kubenswrapper[4835]: I1002 11:19:08.997740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-config\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.099608 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtgh\" (UniqueName: 
\"kubernetes.io/projected/b8894eb8-5eee-48da-81e5-1a98616c3a1f-kube-api-access-9qtgh\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.099688 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.099719 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.099747 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.099803 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.099864 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-config\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.100905 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-config\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.101555 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.101566 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.102190 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" 
(UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.102642 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.137950 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtgh\" (UniqueName: \"kubernetes.io/projected/b8894eb8-5eee-48da-81e5-1a98616c3a1f-kube-api-access-9qtgh\") pod \"dnsmasq-dns-864d5fc68c-dsbm5\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.281659 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.400932 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.405178 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8k6h\" (UniqueName: \"kubernetes.io/projected/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-kube-api-access-r8k6h\") pod \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.405240 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-dns-svc\") pod \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.405303 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-config\") pod \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.405317 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-sb\") pod \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.405386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-nb\") pod \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\" (UID: \"87ac457e-e0c6-4108-b1f0-4eaae559d4a5\") " Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.412958 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-kube-api-access-r8k6h" (OuterVolumeSpecName: "kube-api-access-r8k6h") pod "87ac457e-e0c6-4108-b1f0-4eaae559d4a5" (UID: "87ac457e-e0c6-4108-b1f0-4eaae559d4a5"). InnerVolumeSpecName "kube-api-access-r8k6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.464151 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87ac457e-e0c6-4108-b1f0-4eaae559d4a5" (UID: "87ac457e-e0c6-4108-b1f0-4eaae559d4a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.475367 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87ac457e-e0c6-4108-b1f0-4eaae559d4a5" (UID: "87ac457e-e0c6-4108-b1f0-4eaae559d4a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.478275 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-config" (OuterVolumeSpecName: "config") pod "87ac457e-e0c6-4108-b1f0-4eaae559d4a5" (UID: "87ac457e-e0c6-4108-b1f0-4eaae559d4a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.482884 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87ac457e-e0c6-4108-b1f0-4eaae559d4a5" (UID: "87ac457e-e0c6-4108-b1f0-4eaae559d4a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.507821 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8k6h\" (UniqueName: \"kubernetes.io/projected/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-kube-api-access-r8k6h\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.507871 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.507883 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.507893 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.507903 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87ac457e-e0c6-4108-b1f0-4eaae559d4a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.656531 4835 generic.go:334] "Generic (PLEG): container finished" podID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerID="07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570" exitCode=0 Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.656605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" 
event={"ID":"87ac457e-e0c6-4108-b1f0-4eaae559d4a5","Type":"ContainerDied","Data":"07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570"} Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.656620 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.656638 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mp7rf" event={"ID":"87ac457e-e0c6-4108-b1f0-4eaae559d4a5","Type":"ContainerDied","Data":"90415390f72e5a76481afe4210904baec79efc9e4faf7d68d9d3b21132688495"} Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.656660 4835 scope.go:117] "RemoveContainer" containerID="07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.693995 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mp7rf"] Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.694129 4835 scope.go:117] "RemoveContainer" containerID="08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.705544 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mp7rf"] Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.733943 4835 scope.go:117] "RemoveContainer" containerID="07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570" Oct 02 11:19:09 crc kubenswrapper[4835]: E1002 11:19:09.734452 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570\": container with ID starting with 07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570 not found: ID does not exist" containerID="07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.734497 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570"} err="failed to get container status \"07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570\": rpc error: code = NotFound desc = could not find container \"07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570\": container with ID starting with 07f97a00b3c1a9a33c3f8049b241657a0464a2ed746305ba24b42c55984fa570 not found: ID does not exist" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.734525 4835 scope.go:117] "RemoveContainer" containerID="08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685" Oct 02 11:19:09 crc kubenswrapper[4835]: E1002 11:19:09.735012 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685\": container with ID starting with 08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685 not found: ID does not exist" containerID="08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.735074 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685"} err="failed to get container status \"08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685\": 
rpc error: code = NotFound desc = could not find container \"08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685\": container with ID starting with 08d690cf90c98fd3d92b4d679476ed8c43a53dfe606ef2708a5b0b31c98c7685 not found: ID does not exist" Oct 02 11:19:09 crc kubenswrapper[4835]: I1002 11:19:09.739187 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-dsbm5"] Oct 02 11:19:09 crc kubenswrapper[4835]: W1002 11:19:09.740356 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8894eb8_5eee_48da_81e5_1a98616c3a1f.slice/crio-866622bce53fe65b2312ba2645020fe7dc422b1fa0a1350fd81890df1861e4b3 WatchSource:0}: Error finding container 866622bce53fe65b2312ba2645020fe7dc422b1fa0a1350fd81890df1861e4b3: Status 404 returned error can't find the container with id 866622bce53fe65b2312ba2645020fe7dc422b1fa0a1350fd81890df1861e4b3 Oct 02 11:19:10 crc kubenswrapper[4835]: I1002 11:19:10.268496 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" path="/var/lib/kubelet/pods/87ac457e-e0c6-4108-b1f0-4eaae559d4a5/volumes" Oct 02 11:19:10 crc kubenswrapper[4835]: I1002 11:19:10.666285 4835 generic.go:334] "Generic (PLEG): container finished" podID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerID="2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab" exitCode=0 Oct 02 11:19:10 crc kubenswrapper[4835]: I1002 11:19:10.666373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" event={"ID":"b8894eb8-5eee-48da-81e5-1a98616c3a1f","Type":"ContainerDied","Data":"2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab"} Oct 02 11:19:10 crc kubenswrapper[4835]: I1002 11:19:10.666408 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" event={"ID":"b8894eb8-5eee-48da-81e5-1a98616c3a1f","Type":"ContainerStarted","Data":"866622bce53fe65b2312ba2645020fe7dc422b1fa0a1350fd81890df1861e4b3"} Oct 02 11:19:11 crc kubenswrapper[4835]: I1002 11:19:11.679189 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" event={"ID":"b8894eb8-5eee-48da-81e5-1a98616c3a1f","Type":"ContainerStarted","Data":"5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5"} Oct 02 11:19:11 crc kubenswrapper[4835]: I1002 11:19:11.679678 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:11 crc kubenswrapper[4835]: I1002 11:19:11.984005 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:19:11 crc kubenswrapper[4835]: I1002 11:19:11.984093 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.284967 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:19:19 crc 
kubenswrapper[4835]: I1002 11:19:19.316070 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" podStartSLOduration=11.316046112 podStartE2EDuration="11.316046112s" podCreationTimestamp="2025-10-02 11:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:19:11.698176239 +0000 UTC m=+1428.258083840" watchObservedRunningTime="2025-10-02 11:19:19.316046112 +0000 UTC m=+1435.875953693" Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.353439 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2g8xw"] Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.353892 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" podUID="fc024136-4d73-471e-b1c6-86229c8264fa" containerName="dnsmasq-dns" containerID="cri-o://484c49d26816d65fc222f1890a428d13caa81de3af992b8148ce1646a1daa389" gracePeriod=10 Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.764184 4835 generic.go:334] "Generic (PLEG): container finished" podID="fc024136-4d73-471e-b1c6-86229c8264fa" containerID="484c49d26816d65fc222f1890a428d13caa81de3af992b8148ce1646a1daa389" exitCode=0 Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.764473 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" event={"ID":"fc024136-4d73-471e-b1c6-86229c8264fa","Type":"ContainerDied","Data":"484c49d26816d65fc222f1890a428d13caa81de3af992b8148ce1646a1daa389"} Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.764650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" event={"ID":"fc024136-4d73-471e-b1c6-86229c8264fa","Type":"ContainerDied","Data":"d20f042ba6fa7b23b81ac758de2b8c2799b0ef34a985a9a8e478d5a6f6290ff4"} Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.764671 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d20f042ba6fa7b23b81ac758de2b8c2799b0ef34a985a9a8e478d5a6f6290ff4" Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.784114 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.929086 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-config\") pod \"fc024136-4d73-471e-b1c6-86229c8264fa\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.929174 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-nb\") pod \"fc024136-4d73-471e-b1c6-86229c8264fa\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.929332 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-dns-svc\") pod \"fc024136-4d73-471e-b1c6-86229c8264fa\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.929478 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfstn\" (UniqueName: \"kubernetes.io/projected/fc024136-4d73-471e-b1c6-86229c8264fa-kube-api-access-hfstn\") pod \"fc024136-4d73-471e-b1c6-86229c8264fa\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.929520 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-openstack-edpm-ipam\") pod \"fc024136-4d73-471e-b1c6-86229c8264fa\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.929544 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-sb\") pod \"fc024136-4d73-471e-b1c6-86229c8264fa\" (UID: \"fc024136-4d73-471e-b1c6-86229c8264fa\") " Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.934611 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc024136-4d73-471e-b1c6-86229c8264fa-kube-api-access-hfstn" (OuterVolumeSpecName: "kube-api-access-hfstn") pod "fc024136-4d73-471e-b1c6-86229c8264fa" (UID: "fc024136-4d73-471e-b1c6-86229c8264fa"). InnerVolumeSpecName "kube-api-access-hfstn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.991864 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc024136-4d73-471e-b1c6-86229c8264fa" (UID: "fc024136-4d73-471e-b1c6-86229c8264fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.994350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc024136-4d73-471e-b1c6-86229c8264fa" (UID: "fc024136-4d73-471e-b1c6-86229c8264fa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:19 crc kubenswrapper[4835]: I1002 11:19:19.998796 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "fc024136-4d73-471e-b1c6-86229c8264fa" (UID: "fc024136-4d73-471e-b1c6-86229c8264fa"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.008764 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-config" (OuterVolumeSpecName: "config") pod "fc024136-4d73-471e-b1c6-86229c8264fa" (UID: "fc024136-4d73-471e-b1c6-86229c8264fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.032961 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc024136-4d73-471e-b1c6-86229c8264fa" (UID: "fc024136-4d73-471e-b1c6-86229c8264fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.034661 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.034700 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfstn\" (UniqueName: \"kubernetes.io/projected/fc024136-4d73-471e-b1c6-86229c8264fa-kube-api-access-hfstn\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.034717 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.034728 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.034739 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.034749 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc024136-4d73-471e-b1c6-86229c8264fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.771873 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2g8xw" Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.799777 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2g8xw"] Oct 02 11:19:20 crc kubenswrapper[4835]: I1002 11:19:20.810296 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2g8xw"] Oct 02 11:19:22 crc kubenswrapper[4835]: I1002 11:19:22.261753 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc024136-4d73-471e-b1c6-86229c8264fa" path="/var/lib/kubelet/pods/fc024136-4d73-471e-b1c6-86229c8264fa/volumes" Oct 02 11:19:28 crc kubenswrapper[4835]: I1002 11:19:28.855986 4835 generic.go:334] "Generic (PLEG): container finished" podID="743bfb49-5459-4911-8eee-4bb313368c21" containerID="7de499bd43b507a25bdaa838ae06485a956df0d98ac26d9c1471649bd968b1ac" exitCode=0 Oct 02 11:19:28 crc kubenswrapper[4835]: I1002 11:19:28.856110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"743bfb49-5459-4911-8eee-4bb313368c21","Type":"ContainerDied","Data":"7de499bd43b507a25bdaa838ae06485a956df0d98ac26d9c1471649bd968b1ac"} Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.626574 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw"] Oct 02 11:19:29 crc kubenswrapper[4835]: E1002 11:19:29.627303 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc024136-4d73-471e-b1c6-86229c8264fa" containerName="init" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.627318 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc024136-4d73-471e-b1c6-86229c8264fa" containerName="init" Oct 02 11:19:29 crc kubenswrapper[4835]: E1002 11:19:29.627350 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerName="dnsmasq-dns" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.627356 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerName="dnsmasq-dns" Oct 02 11:19:29 crc kubenswrapper[4835]: E1002 11:19:29.627366 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc024136-4d73-471e-b1c6-86229c8264fa" containerName="dnsmasq-dns" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.627374 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc024136-4d73-471e-b1c6-86229c8264fa" containerName="dnsmasq-dns" Oct 02 11:19:29 crc kubenswrapper[4835]: E1002 11:19:29.627390 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerName="init" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.627397 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerName="init" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.627609 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ac457e-e0c6-4108-b1f0-4eaae559d4a5" containerName="dnsmasq-dns" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.627628 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc024136-4d73-471e-b1c6-86229c8264fa" containerName="dnsmasq-dns" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.628335 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.630359 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.635775 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.637610 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.638014 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw"] Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.638623 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.808878 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.808946 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2tr\" (UniqueName: \"kubernetes.io/projected/7cdc9e59-a224-4be9-9908-3629031c7613-kube-api-access-vm2tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.809125 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.809259 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.880844 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"743bfb49-5459-4911-8eee-4bb313368c21","Type":"ContainerStarted","Data":"715efc6063c98abb80d0dffe68f254b4fd92508aafb526e67257125f4e895e34"} Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.881332 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.883268 4835 generic.go:334] "Generic (PLEG): container finished" podID="74ecc09a-8044-49d6-8c9b-2cbcc56d9612" containerID="6ee8fec3c6bd26691c3b2a3686e11dae982241a76e21367bbe21ad4d7300033e" 
exitCode=0 Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.883267 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"74ecc09a-8044-49d6-8c9b-2cbcc56d9612","Type":"ContainerDied","Data":"6ee8fec3c6bd26691c3b2a3686e11dae982241a76e21367bbe21ad4d7300033e"} Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.916937 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.917038 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2tr\" (UniqueName: \"kubernetes.io/projected/7cdc9e59-a224-4be9-9908-3629031c7613-kube-api-access-vm2tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.917112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.917176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.922240 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.924772 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.924913 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.932428 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=36.93240616 podStartE2EDuration="36.93240616s" podCreationTimestamp="2025-10-02 11:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:19:29.912480971 +0000 UTC m=+1446.472388572" watchObservedRunningTime="2025-10-02 11:19:29.93240616 +0000 UTC m=+1446.492313741" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.944350 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm2tr\" (UniqueName: \"kubernetes.io/projected/7cdc9e59-a224-4be9-9908-3629031c7613-kube-api-access-vm2tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:29 crc kubenswrapper[4835]: I1002 11:19:29.950674 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:30 crc kubenswrapper[4835]: I1002 11:19:30.570584 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw"] Oct 02 11:19:30 crc kubenswrapper[4835]: I1002 11:19:30.897629 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" event={"ID":"7cdc9e59-a224-4be9-9908-3629031c7613","Type":"ContainerStarted","Data":"25dfb1c17a99914d9490579bd785a88db18e699a3b75c77c29fe49c97a62154e"} Oct 02 11:19:30 crc kubenswrapper[4835]: I1002 11:19:30.899866 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"74ecc09a-8044-49d6-8c9b-2cbcc56d9612","Type":"ContainerStarted","Data":"74780c113448ecfd8dba038eff1ed01e3358f6e466e9b732442fceaaea30048c"} Oct 02 11:19:30 crc kubenswrapper[4835]: I1002 11:19:30.900360 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:19:30 crc kubenswrapper[4835]: I1002 11:19:30.932186 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.932161743 podStartE2EDuration="36.932161743s" podCreationTimestamp="2025-10-02 11:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:19:30.923069269 +0000 UTC m=+1447.482976850" watchObservedRunningTime="2025-10-02 11:19:30.932161743 +0000 UTC m=+1447.492069324" Oct 02 11:19:41 crc kubenswrapper[4835]: I1002 11:19:41.009585 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" event={"ID":"7cdc9e59-a224-4be9-9908-3629031c7613","Type":"ContainerStarted","Data":"1eb230f23a7ec5033317bf72a75b47e609265a7a840e6ec77ad4f14f41b05d03"} Oct 02 11:19:41 crc kubenswrapper[4835]: I1002 11:19:41.984529 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:19:41 crc kubenswrapper[4835]: I1002 11:19:41.984612 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:19:43 crc kubenswrapper[4835]: I1002 11:19:43.876569 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 11:19:43 crc kubenswrapper[4835]: I1002 11:19:43.907994 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" podStartSLOduration=5.158489967 podStartE2EDuration="14.907965347s" podCreationTimestamp="2025-10-02 11:19:29 +0000 UTC" firstStartedPulling="2025-10-02 11:19:30.571123641 +0000 UTC m=+1447.131031222" lastFinishedPulling="2025-10-02 11:19:40.320599031 +0000 UTC m=+1456.880506602" observedRunningTime="2025-10-02 11:19:41.028294448 +0000 UTC m=+1457.588202079" watchObservedRunningTime="2025-10-02 11:19:43.907965347 +0000 UTC m=+1460.467872928" Oct 02 11:19:44 crc kubenswrapper[4835]: I1002 11:19:44.913552 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:19:52 crc kubenswrapper[4835]: I1002 11:19:52.103883 4835 generic.go:334] "Generic (PLEG): container finished" podID="7cdc9e59-a224-4be9-9908-3629031c7613" containerID="1eb230f23a7ec5033317bf72a75b47e609265a7a840e6ec77ad4f14f41b05d03" exitCode=0 Oct 02 11:19:52 crc kubenswrapper[4835]: I1002 11:19:52.103973 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" event={"ID":"7cdc9e59-a224-4be9-9908-3629031c7613","Type":"ContainerDied","Data":"1eb230f23a7ec5033317bf72a75b47e609265a7a840e6ec77ad4f14f41b05d03"} Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.594095 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.695136 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-inventory\") pod \"7cdc9e59-a224-4be9-9908-3629031c7613\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.695204 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2tr\" (UniqueName: \"kubernetes.io/projected/7cdc9e59-a224-4be9-9908-3629031c7613-kube-api-access-vm2tr\") pod \"7cdc9e59-a224-4be9-9908-3629031c7613\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.696483 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-repo-setup-combined-ca-bundle\") pod \"7cdc9e59-a224-4be9-9908-3629031c7613\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.696514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-ssh-key\") pod \"7cdc9e59-a224-4be9-9908-3629031c7613\" (UID: \"7cdc9e59-a224-4be9-9908-3629031c7613\") " Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.702877 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cdc9e59-a224-4be9-9908-3629031c7613-kube-api-access-vm2tr" (OuterVolumeSpecName: "kube-api-access-vm2tr") pod "7cdc9e59-a224-4be9-9908-3629031c7613" (UID: "7cdc9e59-a224-4be9-9908-3629031c7613"). InnerVolumeSpecName "kube-api-access-vm2tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.703012 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7cdc9e59-a224-4be9-9908-3629031c7613" (UID: "7cdc9e59-a224-4be9-9908-3629031c7613"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.732835 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-inventory" (OuterVolumeSpecName: "inventory") pod "7cdc9e59-a224-4be9-9908-3629031c7613" (UID: "7cdc9e59-a224-4be9-9908-3629031c7613"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.744635 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7cdc9e59-a224-4be9-9908-3629031c7613" (UID: "7cdc9e59-a224-4be9-9908-3629031c7613"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.798664 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.798692 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2tr\" (UniqueName: \"kubernetes.io/projected/7cdc9e59-a224-4be9-9908-3629031c7613-kube-api-access-vm2tr\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.798704 4835 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:53 crc kubenswrapper[4835]: I1002 11:19:53.798712 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cdc9e59-a224-4be9-9908-3629031c7613-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.126153 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" event={"ID":"7cdc9e59-a224-4be9-9908-3629031c7613","Type":"ContainerDied","Data":"25dfb1c17a99914d9490579bd785a88db18e699a3b75c77c29fe49c97a62154e"} Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.126549 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dfb1c17a99914d9490579bd785a88db18e699a3b75c77c29fe49c97a62154e" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.126192 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.210685 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59"] Oct 02 11:19:54 crc kubenswrapper[4835]: E1002 11:19:54.211116 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cdc9e59-a224-4be9-9908-3629031c7613" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.211136 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cdc9e59-a224-4be9-9908-3629031c7613" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.211382 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cdc9e59-a224-4be9-9908-3629031c7613" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.212009 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.213603 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.213945 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.214338 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.214369 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.230542 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59"] Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.307187 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.307372 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.307435 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24sh8\" (UniqueName: \"kubernetes.io/projected/62f12acc-2a68-4f9f-bde2-c223c102bf2a-kube-api-access-24sh8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.307464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.410018 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24sh8\" (UniqueName: \"kubernetes.io/projected/62f12acc-2a68-4f9f-bde2-c223c102bf2a-kube-api-access-24sh8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.410084 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.410161 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.410263 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.414117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.414313 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.415043 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.429851 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24sh8\" (UniqueName: \"kubernetes.io/projected/62f12acc-2a68-4f9f-bde2-c223c102bf2a-kube-api-access-24sh8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:54 crc kubenswrapper[4835]: I1002 11:19:54.547610 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:19:55 crc kubenswrapper[4835]: I1002 11:19:55.077694 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59"] Oct 02 11:19:55 crc kubenswrapper[4835]: I1002 11:19:55.139552 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" event={"ID":"62f12acc-2a68-4f9f-bde2-c223c102bf2a","Type":"ContainerStarted","Data":"873fe585ea52371cd0c1b2c4e06b89108f52b2c34e4f0b8dd22d1267a3b6dbac"} Oct 02 11:19:56 crc kubenswrapper[4835]: I1002 11:19:56.018636 4835 scope.go:117] "RemoveContainer" containerID="dda36427274bde044185c3e10000bd1e81dd71ad32c6fd08b776a774e9aa5cce" Oct 02 11:19:56 crc kubenswrapper[4835]: I1002 11:19:56.083497 4835 scope.go:117] "RemoveContainer" containerID="78906bfb21ff2ad9bf11a249f1055013fb0b8cdc3afb343be8d537a89c9cc7ff" Oct 02 11:19:56 crc kubenswrapper[4835]: I1002 11:19:56.145347 4835 scope.go:117] "RemoveContainer" containerID="bbc857a30cd84653eda4005289281b2b9f1c274616a77d8554e672d9b98bb02d" Oct 02 11:19:57 crc kubenswrapper[4835]: I1002 11:19:57.172981 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" event={"ID":"62f12acc-2a68-4f9f-bde2-c223c102bf2a","Type":"ContainerStarted","Data":"5de9a334b11cbf9d8419906fd6d893e7e8ef0f220847f058c2dce429ac45bb89"} Oct 02 11:19:57 crc kubenswrapper[4835]: I1002 11:19:57.190445 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" podStartSLOduration=2.359241183 podStartE2EDuration="3.190421662s" podCreationTimestamp="2025-10-02 11:19:54 +0000 UTC" firstStartedPulling="2025-10-02 11:19:55.086281113 +0000 UTC m=+1471.646188694" lastFinishedPulling="2025-10-02 11:19:55.917461592 +0000 UTC m=+1472.477369173" observedRunningTime="2025-10-02 11:19:57.186672113 +0000 UTC m=+1473.746579724" watchObservedRunningTime="2025-10-02 11:19:57.190421662 +0000 UTC m=+1473.750329243" Oct 02 11:20:11 crc kubenswrapper[4835]: I1002 11:20:11.984802 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:20:11 crc kubenswrapper[4835]: I1002 11:20:11.985512 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:20:11 crc kubenswrapper[4835]: I1002 11:20:11.985581 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:20:11 crc kubenswrapper[4835]: I1002 11:20:11.986618 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74bda1206c6cef4b94a808d597fd0b18bc43e5697e9459d0f58f3237db138b7b"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:20:11 crc 
kubenswrapper[4835]: I1002 11:20:11.986690 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://74bda1206c6cef4b94a808d597fd0b18bc43e5697e9459d0f58f3237db138b7b" gracePeriod=600 Oct 02 11:20:12 crc kubenswrapper[4835]: I1002 11:20:12.346475 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="74bda1206c6cef4b94a808d597fd0b18bc43e5697e9459d0f58f3237db138b7b" exitCode=0 Oct 02 11:20:12 crc kubenswrapper[4835]: I1002 11:20:12.346711 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"74bda1206c6cef4b94a808d597fd0b18bc43e5697e9459d0f58f3237db138b7b"} Oct 02 11:20:12 crc kubenswrapper[4835]: I1002 11:20:12.346836 4835 scope.go:117] "RemoveContainer" containerID="ece753eadbbcedfb37995a0897d789fdf5c6660566a042ca4a738c29f1789c89" Oct 02 11:20:13 crc kubenswrapper[4835]: I1002 11:20:13.356143 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a"} Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.245392 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr5j"] Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.250195 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.264730 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr5j"] Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.393503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-catalog-content\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.393556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjvs7\" (UniqueName: \"kubernetes.io/projected/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-kube-api-access-xjvs7\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.393588 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-utilities\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.496044 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-catalog-content\") pod \"redhat-marketplace-fgr5j\" (UID: 
\"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.496096 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjvs7\" (UniqueName: \"kubernetes.io/projected/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-kube-api-access-xjvs7\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.496119 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-utilities\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.496554 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-catalog-content\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.496623 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-utilities\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.514961 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjvs7\" (UniqueName: \"kubernetes.io/projected/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-kube-api-access-xjvs7\") pod \"redhat-marketplace-fgr5j\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:00 crc kubenswrapper[4835]: I1002 11:21:00.580110 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:01 crc kubenswrapper[4835]: I1002 11:21:01.038728 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr5j"] Oct 02 11:21:01 crc kubenswrapper[4835]: I1002 11:21:01.802213 4835 generic.go:334] "Generic (PLEG): container finished" podID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerID="005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334" exitCode=0 Oct 02 11:21:01 crc kubenswrapper[4835]: I1002 11:21:01.802334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr5j" event={"ID":"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6","Type":"ContainerDied","Data":"005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334"} Oct 02 11:21:01 crc kubenswrapper[4835]: I1002 11:21:01.802562 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr5j" event={"ID":"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6","Type":"ContainerStarted","Data":"ed1567efe6d26e62653991887c004cd004ab503fbe73009041a30086fc3de7e9"} Oct 02 11:21:02 crc kubenswrapper[4835]: I1002 11:21:02.813170 4835 generic.go:334] "Generic (PLEG): container finished" podID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerID="e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb" exitCode=0 Oct 02 11:21:02 crc kubenswrapper[4835]: I1002 11:21:02.813235 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr5j" event={"ID":"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6","Type":"ContainerDied","Data":"e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb"} Oct 02 11:21:03 crc kubenswrapper[4835]: I1002 11:21:03.839600 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr5j" event={"ID":"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6","Type":"ContainerStarted","Data":"4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2"} Oct 02 11:21:03 crc kubenswrapper[4835]: I1002 11:21:03.867268 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fgr5j" podStartSLOduration=2.281681999 podStartE2EDuration="3.867245566s" podCreationTimestamp="2025-10-02 11:21:00 +0000 UTC" firstStartedPulling="2025-10-02 11:21:01.804194193 +0000 UTC m=+1538.364101774" lastFinishedPulling="2025-10-02 11:21:03.38975776 +0000 UTC m=+1539.949665341" observedRunningTime="2025-10-02 11:21:03.856285668 +0000 UTC m=+1540.416193269" watchObservedRunningTime="2025-10-02 11:21:03.867245566 +0000 UTC m=+1540.427153147" Oct 02 11:21:10 crc kubenswrapper[4835]: I1002 11:21:10.580507 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:10 crc kubenswrapper[4835]: I1002 11:21:10.581091 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:10 crc kubenswrapper[4835]: I1002 11:21:10.655323 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:10 crc kubenswrapper[4835]: I1002 11:21:10.978154 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:11 crc kubenswrapper[4835]: I1002 11:21:11.038346 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-fgr5j"] Oct 02 11:21:12 crc kubenswrapper[4835]: I1002 11:21:12.923693 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fgr5j" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="registry-server" containerID="cri-o://4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2" gracePeriod=2 Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.385402 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.461924 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-catalog-content\") pod \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.462322 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjvs7\" (UniqueName: \"kubernetes.io/projected/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-kube-api-access-xjvs7\") pod \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.462477 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-utilities\") pod \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\" (UID: \"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6\") " Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.463148 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-utilities" (OuterVolumeSpecName: "utilities") pod "60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" (UID: "60d4459f-18a8-4a4a-87ad-2d1399e9a3d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.467886 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-kube-api-access-xjvs7" (OuterVolumeSpecName: "kube-api-access-xjvs7") pod "60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" (UID: "60d4459f-18a8-4a4a-87ad-2d1399e9a3d6"). InnerVolumeSpecName "kube-api-access-xjvs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.475631 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" (UID: "60d4459f-18a8-4a4a-87ad-2d1399e9a3d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.565558 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjvs7\" (UniqueName: \"kubernetes.io/projected/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-kube-api-access-xjvs7\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.565628 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.565642 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.938326 4835 generic.go:334] "Generic (PLEG): container finished" podID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerID="4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2" exitCode=0 Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.938394 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr5j" event={"ID":"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6","Type":"ContainerDied","Data":"4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2"} Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.938449 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr5j" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.938485 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr5j" event={"ID":"60d4459f-18a8-4a4a-87ad-2d1399e9a3d6","Type":"ContainerDied","Data":"ed1567efe6d26e62653991887c004cd004ab503fbe73009041a30086fc3de7e9"} Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.938518 4835 scope.go:117] "RemoveContainer" containerID="4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2" Oct 02 11:21:13 crc kubenswrapper[4835]: I1002 11:21:13.971878 4835 scope.go:117] "RemoveContainer" containerID="e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:13.999963 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr5j"] Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.006447 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr5j"] Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.016612 4835 scope.go:117] "RemoveContainer" containerID="005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.065360 4835 scope.go:117] "RemoveContainer" containerID="4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2" Oct 02 11:21:14 crc kubenswrapper[4835]: E1002 11:21:14.065796 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2\": container with ID starting with 4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2 not found: ID does not exist" containerID="4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.065850 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2"} err="failed to get container status \"4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2\": rpc error: code = NotFound desc = could not find container \"4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2\": container with ID starting with 4145ef850a8a390831b8d988b3429ca63448b0c52678fff2d38a4850b1aa2ff2 not found: ID does not exist" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.065886 4835 scope.go:117] "RemoveContainer" containerID="e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb" Oct 02 11:21:14 crc kubenswrapper[4835]: E1002 11:21:14.066251 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb\": container with ID starting with e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb not found: ID does not exist" containerID="e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.066283 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb"} err="failed to get container status \"e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb\": rpc error: code = NotFound desc = could not find container \"e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb\": container with ID starting with e9a8094ef407694617c99c18d184b478b6df0f1f2825cfc8c3cbe9e836c5a3eb not found: ID does not exist" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.066311 4835 scope.go:117] "RemoveContainer" containerID="005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334" Oct 02 11:21:14 crc kubenswrapper[4835]: E1002 11:21:14.066608 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334\": container with ID starting with 005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334 not found: ID does not exist" containerID="005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.066634 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334"} err="failed to get container status \"005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334\": rpc error: code = NotFound desc = could not find container \"005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334\": container with ID starting with 005e30f1e8642a18c0c940b0ec4cadfcc73828281b1bd68a9230dab497d27334 not found: ID does not exist" Oct 02 11:21:14 crc kubenswrapper[4835]: I1002 11:21:14.263897 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" path="/var/lib/kubelet/pods/60d4459f-18a8-4a4a-87ad-2d1399e9a3d6/volumes" Oct 02 11:22:41 crc kubenswrapper[4835]: I1002 11:22:41.984338 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:22:41 crc kubenswrapper[4835]: I1002 11:22:41.984955 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:22:56 crc kubenswrapper[4835]: I1002 11:22:56.358856 4835 scope.go:117] "RemoveContainer" containerID="b3754b5f33119b5d94162cdee22794bc3130c3996f67195a78f0dfc70fa52cb4" Oct 02 11:22:56 crc kubenswrapper[4835]: I1002 11:22:56.391837 4835 scope.go:117] "RemoveContainer" containerID="28f4c47a410fca20364d42e73a3e576420f90c71d9d5dd811a18a67820f3ac32" Oct 02 11:22:59 crc kubenswrapper[4835]: I1002 11:22:59.898595 4835 generic.go:334] "Generic (PLEG): container finished" podID="62f12acc-2a68-4f9f-bde2-c223c102bf2a" containerID="5de9a334b11cbf9d8419906fd6d893e7e8ef0f220847f058c2dce429ac45bb89" exitCode=0 Oct 02 11:22:59 crc kubenswrapper[4835]: I1002 11:22:59.898702 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" event={"ID":"62f12acc-2a68-4f9f-bde2-c223c102bf2a","Type":"ContainerDied","Data":"5de9a334b11cbf9d8419906fd6d893e7e8ef0f220847f058c2dce429ac45bb89"} Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.305088 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.404969 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-bootstrap-combined-ca-bundle\") pod \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.405405 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-ssh-key\") pod \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.405586 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24sh8\" (UniqueName: \"kubernetes.io/projected/62f12acc-2a68-4f9f-bde2-c223c102bf2a-kube-api-access-24sh8\") pod \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.405726 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-inventory\") pod \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\" (UID: \"62f12acc-2a68-4f9f-bde2-c223c102bf2a\") " Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.411826 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "62f12acc-2a68-4f9f-bde2-c223c102bf2a" (UID: "62f12acc-2a68-4f9f-bde2-c223c102bf2a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.411896 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f12acc-2a68-4f9f-bde2-c223c102bf2a-kube-api-access-24sh8" (OuterVolumeSpecName: "kube-api-access-24sh8") pod "62f12acc-2a68-4f9f-bde2-c223c102bf2a" (UID: "62f12acc-2a68-4f9f-bde2-c223c102bf2a"). InnerVolumeSpecName "kube-api-access-24sh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.434108 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62f12acc-2a68-4f9f-bde2-c223c102bf2a" (UID: "62f12acc-2a68-4f9f-bde2-c223c102bf2a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.438761 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-inventory" (OuterVolumeSpecName: "inventory") pod "62f12acc-2a68-4f9f-bde2-c223c102bf2a" (UID: "62f12acc-2a68-4f9f-bde2-c223c102bf2a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.509238 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.509297 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24sh8\" (UniqueName: \"kubernetes.io/projected/62f12acc-2a68-4f9f-bde2-c223c102bf2a-kube-api-access-24sh8\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.509309 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.509318 4835 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f12acc-2a68-4f9f-bde2-c223c102bf2a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.921446 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" event={"ID":"62f12acc-2a68-4f9f-bde2-c223c102bf2a","Type":"ContainerDied","Data":"873fe585ea52371cd0c1b2c4e06b89108f52b2c34e4f0b8dd22d1267a3b6dbac"} Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.921749 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="873fe585ea52371cd0c1b2c4e06b89108f52b2c34e4f0b8dd22d1267a3b6dbac" Oct 02 11:23:01 crc kubenswrapper[4835]: I1002 11:23:01.921532 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.003771 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx"] Oct 02 11:23:02 crc kubenswrapper[4835]: E1002 11:23:02.004138 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="registry-server" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.004153 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="registry-server" Oct 02 11:23:02 crc kubenswrapper[4835]: E1002 11:23:02.004170 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="extract-content" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.004175 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="extract-content" Oct 02 11:23:02 crc kubenswrapper[4835]: E1002 11:23:02.004188 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f12acc-2a68-4f9f-bde2-c223c102bf2a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.004195 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f12acc-2a68-4f9f-bde2-c223c102bf2a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:23:02 crc kubenswrapper[4835]: E1002 11:23:02.004208 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="extract-utilities" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.004214 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="extract-utilities" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.004410 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f12acc-2a68-4f9f-bde2-c223c102bf2a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.004423 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d4459f-18a8-4a4a-87ad-2d1399e9a3d6" containerName="registry-server" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.005139 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.007130 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.007416 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.007351 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.008019 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.021926 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx"] Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.119487 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.119550 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.119697 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6b88\" (UniqueName: \"kubernetes.io/projected/2ed489f8-6c30-403d-8634-2c67229b4114-kube-api-access-p6b88\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.222429 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6b88\" (UniqueName: \"kubernetes.io/projected/2ed489f8-6c30-403d-8634-2c67229b4114-kube-api-access-p6b88\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.222645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.222678 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-ssh-key\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.236562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.236758 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.241444 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6b88\" (UniqueName: \"kubernetes.io/projected/2ed489f8-6c30-403d-8634-2c67229b4114-kube-api-access-p6b88\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.323973 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.839020 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx"] Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.845852 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:23:02 crc kubenswrapper[4835]: I1002 11:23:02.931874 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" event={"ID":"2ed489f8-6c30-403d-8634-2c67229b4114","Type":"ContainerStarted","Data":"3380702e46c3ce15895ac88934ca952d4a74b741d8adeb29a7458bdd24d60b79"} Oct 02 11:23:04 crc kubenswrapper[4835]: I1002 11:23:04.961649 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" event={"ID":"2ed489f8-6c30-403d-8634-2c67229b4114","Type":"ContainerStarted","Data":"acac1ce3832e8a4ea0cff66ecd507902e9ecd7896984dfd99b4dc495c33fa7de"} Oct 02 11:23:04 crc kubenswrapper[4835]: I1002 11:23:04.982043 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" podStartSLOduration=3.029961154 podStartE2EDuration="3.982030882s" podCreationTimestamp="2025-10-02 11:23:01 +0000 UTC" firstStartedPulling="2025-10-02 11:23:02.845606512 +0000 UTC m=+1659.405514093" lastFinishedPulling="2025-10-02 11:23:03.79767624 +0000 UTC m=+1660.357583821" observedRunningTime="2025-10-02 11:23:04.97817286 +0000 UTC m=+1661.538080461" watchObservedRunningTime="2025-10-02 11:23:04.982030882 +0000 UTC m=+1661.541938463" Oct 02 11:23:07 crc kubenswrapper[4835]: I1002 11:23:07.042138 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-create-wbhtf"] Oct 02 11:23:07 crc kubenswrapper[4835]: I1002 11:23:07.052555 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7qlnx"] Oct 02 11:23:07 crc kubenswrapper[4835]: I1002 11:23:07.059624 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wbhtf"] Oct 02 11:23:07 crc kubenswrapper[4835]: I1002 11:23:07.067028 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7qlnx"] Oct 02 11:23:08 crc kubenswrapper[4835]: I1002 11:23:08.034650 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8lvj9"] Oct 02 11:23:08 crc kubenswrapper[4835]: I1002 11:23:08.044465 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8lvj9"] Oct 02 11:23:08 crc kubenswrapper[4835]: I1002 11:23:08.262055 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc00dc1-89cb-437a-a52a-013dbede6dee" path="/var/lib/kubelet/pods/6bc00dc1-89cb-437a-a52a-013dbede6dee/volumes" Oct 02 11:23:08 crc kubenswrapper[4835]: I1002 11:23:08.262751 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a036d6a1-0e89-4e48-af67-ed924271e652" path="/var/lib/kubelet/pods/a036d6a1-0e89-4e48-af67-ed924271e652/volumes" Oct 02 11:23:08 crc kubenswrapper[4835]: I1002 11:23:08.263277 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb816f8-4d3b-441d-acf1-7f0cf9828759" path="/var/lib/kubelet/pods/feb816f8-4d3b-441d-acf1-7f0cf9828759/volumes" Oct 02 11:23:11 crc kubenswrapper[4835]: I1002 11:23:11.984572 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:23:11 crc kubenswrapper[4835]: I1002 11:23:11.984948 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:23:17 crc kubenswrapper[4835]: I1002 11:23:17.037307 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1832-account-create-8gktd"] Oct 02 11:23:17 crc kubenswrapper[4835]: I1002 11:23:17.048093 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-341d-account-create-vdzq8"] Oct 02 11:23:17 crc kubenswrapper[4835]: I1002 11:23:17.060728 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-341d-account-create-vdzq8"] Oct 02 11:23:17 crc kubenswrapper[4835]: I1002 11:23:17.068364 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1832-account-create-8gktd"] Oct 02 11:23:18 crc kubenswrapper[4835]: I1002 11:23:18.024170 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-af88-account-create-7tv6x"] Oct 02 11:23:18 crc kubenswrapper[4835]: I1002 11:23:18.031055 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-af88-account-create-7tv6x"] Oct 02 11:23:18 crc kubenswrapper[4835]: I1002 11:23:18.261885 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce56b89-22d8-42bc-badc-0da2fd73cb25" 
path="/var/lib/kubelet/pods/1ce56b89-22d8-42bc-badc-0da2fd73cb25/volumes" Oct 02 11:23:18 crc kubenswrapper[4835]: I1002 11:23:18.262520 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c" path="/var/lib/kubelet/pods/38b8ad1d-8591-4ee4-8bfd-1d40e7e6158c/volumes" Oct 02 11:23:18 crc kubenswrapper[4835]: I1002 11:23:18.263022 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe3b0b0-65db-451b-afc6-af7628c749a2" path="/var/lib/kubelet/pods/cfe3b0b0-65db-451b-afc6-af7628c749a2/volumes" Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.043695 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9gkdm"] Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.116044 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ntmw9"] Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.129295 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bsb2r"] Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.138448 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9gkdm"] Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.151851 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bsb2r"] Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.162403 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ntmw9"] Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.263706 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90716226-5bad-4b04-92d3-b3efdd7efd6d" path="/var/lib/kubelet/pods/90716226-5bad-4b04-92d3-b3efdd7efd6d/volumes" Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.264388 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca1fdbc-9f7c-43fd-8bd3-812b87bbd432" path="/var/lib/kubelet/pods/aca1fdbc-9f7c-43fd-8bd3-812b87bbd432/volumes" Oct 02 11:23:40 crc kubenswrapper[4835]: I1002 11:23:40.264978 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638" path="/var/lib/kubelet/pods/caaeb6a6-bdf2-48b9-9df8-0a2f5cb6e638/volumes" Oct 02 11:23:41 crc kubenswrapper[4835]: I1002 11:23:41.984625 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:23:41 crc kubenswrapper[4835]: I1002 11:23:41.986402 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:23:41 crc kubenswrapper[4835]: I1002 11:23:41.986505 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:23:41 crc kubenswrapper[4835]: I1002 11:23:41.987264 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a"} 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:23:41 crc kubenswrapper[4835]: I1002 11:23:41.987347 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" gracePeriod=600 Oct 02 11:23:42 crc kubenswrapper[4835]: E1002 11:23:42.115047 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:23:42 crc kubenswrapper[4835]: I1002 11:23:42.288750 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" exitCode=0 Oct 02 11:23:42 crc kubenswrapper[4835]: I1002 11:23:42.288927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a"} Oct 02 11:23:42 crc kubenswrapper[4835]: I1002 11:23:42.289255 4835 scope.go:117] "RemoveContainer" containerID="74bda1206c6cef4b94a808d597fd0b18bc43e5697e9459d0f58f3237db138b7b" Oct 02 11:23:42 crc kubenswrapper[4835]: I1002 11:23:42.290017 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:23:42 crc kubenswrapper[4835]: E1002 11:23:42.290345 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:23:45 crc kubenswrapper[4835]: I1002 11:23:45.054760 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b95b-account-create-jlqb5"] Oct 02 11:23:45 crc kubenswrapper[4835]: I1002 11:23:45.062009 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hbr8l"] Oct 02 11:23:45 crc kubenswrapper[4835]: I1002 11:23:45.070786 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b95b-account-create-jlqb5"] Oct 02 11:23:45 crc kubenswrapper[4835]: I1002 11:23:45.078710 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hbr8l"] Oct 02 11:23:46 crc kubenswrapper[4835]: I1002 11:23:46.028111 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7zkdc"] Oct 02 11:23:46 crc kubenswrapper[4835]: I1002 11:23:46.036992 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7zkdc"] Oct 02 11:23:46 crc kubenswrapper[4835]: I1002 11:23:46.263744 4835 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="2c1e8272-6be9-43d5-99e6-571b3d2a5ba1" path="/var/lib/kubelet/pods/2c1e8272-6be9-43d5-99e6-571b3d2a5ba1/volumes" Oct 02 11:23:46 crc kubenswrapper[4835]: I1002 11:23:46.264501 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bcb867-7f39-415a-8565-027fa8d2963e" path="/var/lib/kubelet/pods/92bcb867-7f39-415a-8565-027fa8d2963e/volumes" Oct 02 11:23:46 crc kubenswrapper[4835]: I1002 11:23:46.265431 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9246fcb-4999-4c04-8d46-729b57a896ef" path="/var/lib/kubelet/pods/d9246fcb-4999-4c04-8d46-729b57a896ef/volumes" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.297504 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-94svm"] Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.300911 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.304675 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-94svm"] Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.403549 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rs96\" (UniqueName: \"kubernetes.io/projected/51aa88a4-9464-47d2-855f-d7865545c0eb-kube-api-access-4rs96\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.403696 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-utilities\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.403740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-catalog-content\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.483546 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n4k8l"] Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.485299 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.504325 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4k8l"] Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.505540 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49v5\" (UniqueName: \"kubernetes.io/projected/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-kube-api-access-z49v5\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.505597 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-utilities\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.505648 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-catalog-content\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.505723 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-utilities\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.505749 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rs96\" (UniqueName: \"kubernetes.io/projected/51aa88a4-9464-47d2-855f-d7865545c0eb-kube-api-access-4rs96\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.505850 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-catalog-content\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.506306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-catalog-content\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.506610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-utilities\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.537156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4rs96\" (UniqueName: \"kubernetes.io/projected/51aa88a4-9464-47d2-855f-d7865545c0eb-kube-api-access-4rs96\") pod \"redhat-operators-94svm\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.606382 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z49v5\" (UniqueName: \"kubernetes.io/projected/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-kube-api-access-z49v5\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.606498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-utilities\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.606570 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-catalog-content\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.607055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-utilities\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.607091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-catalog-content\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.624942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49v5\" (UniqueName: \"kubernetes.io/projected/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-kube-api-access-z49v5\") pod \"community-operators-n4k8l\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.650331 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:23:53 crc kubenswrapper[4835]: I1002 11:23:53.806594 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:23:54 crc kubenswrapper[4835]: I1002 11:23:54.216543 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-94svm"] Oct 02 11:23:54 crc kubenswrapper[4835]: I1002 11:23:54.408955 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4k8l"] Oct 02 11:23:54 crc kubenswrapper[4835]: I1002 11:23:54.413321 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94svm" event={"ID":"51aa88a4-9464-47d2-855f-d7865545c0eb","Type":"ContainerStarted","Data":"10c776646d4d392cc4ff43aa6ea61ecc6a12707bad82124c850b249ed57bcc11"} Oct 02 11:23:54 crc kubenswrapper[4835]: W1002 11:23:54.416967 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd8b932_c11e_4914_b7f8_bb5b1f13e9bc.slice/crio-9c6e09a889147c1a7a188c60fe44b76aff0a054650cf259dda5a4c5e557d4e52 WatchSource:0}: Error finding container 9c6e09a889147c1a7a188c60fe44b76aff0a054650cf259dda5a4c5e557d4e52: Status 404 returned error can't find the container with id 9c6e09a889147c1a7a188c60fe44b76aff0a054650cf259dda5a4c5e557d4e52 Oct 02 11:23:54 crc kubenswrapper[4835]: E1002 11:23:54.816810 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd8b932_c11e_4914_b7f8_bb5b1f13e9bc.slice/crio-conmon-89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd8b932_c11e_4914_b7f8_bb5b1f13e9bc.slice/crio-89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:23:55 crc kubenswrapper[4835]: I1002 11:23:55.258845 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:23:55 crc kubenswrapper[4835]: E1002 11:23:55.259703 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:23:55 crc kubenswrapper[4835]: I1002 11:23:55.424980 4835 generic.go:334] "Generic (PLEG): container finished" podID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerID="1e8e7a6f06e32eea0990fda1433d2694237f40a8999b98748c4d3f79dae7ce95" exitCode=0 Oct 02 11:23:55 crc kubenswrapper[4835]: I1002 11:23:55.425090 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94svm" event={"ID":"51aa88a4-9464-47d2-855f-d7865545c0eb","Type":"ContainerDied","Data":"1e8e7a6f06e32eea0990fda1433d2694237f40a8999b98748c4d3f79dae7ce95"} Oct 02 11:23:55 crc kubenswrapper[4835]: I1002 11:23:55.426810 4835 generic.go:334] "Generic (PLEG): container finished" podID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerID="89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7" exitCode=0 Oct 02 11:23:55 crc kubenswrapper[4835]: I1002 11:23:55.426861 4835 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-n4k8l" event={"ID":"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc","Type":"ContainerDied","Data":"89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7"} Oct 02 11:23:55 crc kubenswrapper[4835]: I1002 11:23:55.427018 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4k8l" event={"ID":"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc","Type":"ContainerStarted","Data":"9c6e09a889147c1a7a188c60fe44b76aff0a054650cf259dda5a4c5e557d4e52"} Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.438713 4835 generic.go:334] "Generic (PLEG): container finished" podID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerID="f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4" exitCode=0 Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.438781 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4k8l" event={"ID":"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc","Type":"ContainerDied","Data":"f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4"} Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.456454 4835 scope.go:117] "RemoveContainer" containerID="4b9690ce3721de722889d416e854e99badf7536706eafcb2b68ce3d7017f3596" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.480425 4835 scope.go:117] "RemoveContainer" containerID="53bab9491bc4f1fa7117c649a765e9d3f854cd3ef6aa0b6951abd3a00bfc8533" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.494975 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lj2vl"] Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.496857 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.506898 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lj2vl"] Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.556846 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-catalog-content\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.556934 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-utilities\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.557021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2r4\" (UniqueName: \"kubernetes.io/projected/04055e43-8a87-42b0-a7c0-bd480ad9a396-kube-api-access-8r2r4\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.578259 4835 scope.go:117] "RemoveContainer" containerID="e1d614c7ea85b6f828fe5a6237381a41ab8606912dd9d7f1b3bd9abadf312385" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.616436 4835 scope.go:117] "RemoveContainer" 
containerID="3f05e99f67eb69762c5e9de1350c42d891c293c38ddcdb2539719037d67fcc33" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.658860 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-utilities\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.658972 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2r4\" (UniqueName: \"kubernetes.io/projected/04055e43-8a87-42b0-a7c0-bd480ad9a396-kube-api-access-8r2r4\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.659047 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-catalog-content\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.659681 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-catalog-content\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.659946 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-utilities\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.698828 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2r4\" (UniqueName: \"kubernetes.io/projected/04055e43-8a87-42b0-a7c0-bd480ad9a396-kube-api-access-8r2r4\") pod \"certified-operators-lj2vl\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.705901 4835 scope.go:117] "RemoveContainer" containerID="9a76b78bc439967afd3c5b37ccdb5c88e396d3cdb1117df0c10bdd2fe3d81ee1" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.735199 4835 scope.go:117] "RemoveContainer" containerID="fc21480867b5920158b6ec75ccf76f6a84da3a714e12b8efa940625c8351f855" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.793312 4835 scope.go:117] "RemoveContainer" containerID="9827952f27a61a56cefbf3fd9d3faf8b888864042d0fafa80597e63a5c2288d5" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.841934 4835 scope.go:117] "RemoveContainer" containerID="7ce377a946ffb86e7ab11f161da89dcf78ee2c757911153e86d558cbd2130c09" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.869941 4835 scope.go:117] "RemoveContainer" containerID="fd42a9561cca9b64ac5d58da62b0197412c8da4160052b49fd6777a827f054e6" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.917032 4835 scope.go:117] "RemoveContainer" containerID="d36b35f4dc40f412e5bf505642090240e0a3bcb55aa5f22d2f9094e7707f035c" Oct 02 11:23:56 crc kubenswrapper[4835]: 
I1002 11:23:56.963913 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:23:56 crc kubenswrapper[4835]: I1002 11:23:56.982592 4835 scope.go:117] "RemoveContainer" containerID="3f3f43474c9b2c2a8e0119c1a1d4fd22a9f2210a32ddc9751185fc54391f00ee" Oct 02 11:23:57 crc kubenswrapper[4835]: I1002 11:23:57.039829 4835 scope.go:117] "RemoveContainer" containerID="edf69bdd7b67918d277fa4aeed8465b90d59b51310177c003f173865baf7b16a" Oct 02 11:23:57 crc kubenswrapper[4835]: I1002 11:23:57.463936 4835 generic.go:334] "Generic (PLEG): container finished" podID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerID="c53663fffe41130f5ccadb4330cf395116b02ec64c5b46714af727c9a7ed4d19" exitCode=0 Oct 02 11:23:57 crc kubenswrapper[4835]: I1002 11:23:57.464009 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94svm" event={"ID":"51aa88a4-9464-47d2-855f-d7865545c0eb","Type":"ContainerDied","Data":"c53663fffe41130f5ccadb4330cf395116b02ec64c5b46714af727c9a7ed4d19"} Oct 02 11:23:57 crc kubenswrapper[4835]: I1002 11:23:57.612433 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lj2vl"] Oct 02 11:23:57 crc kubenswrapper[4835]: W1002 11:23:57.635302 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04055e43_8a87_42b0_a7c0_bd480ad9a396.slice/crio-42a1cafda571025668c9266364e92b6416898619efea2bfb59268fe66e83128c WatchSource:0}: Error finding container 42a1cafda571025668c9266364e92b6416898619efea2bfb59268fe66e83128c: Status 404 returned error can't find the container with id 42a1cafda571025668c9266364e92b6416898619efea2bfb59268fe66e83128c Oct 02 11:23:58 crc kubenswrapper[4835]: I1002 11:23:58.481591 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94svm" event={"ID":"51aa88a4-9464-47d2-855f-d7865545c0eb","Type":"ContainerStarted","Data":"e9d76b3a35710b22c6472e18d2c0b25343382e817c82159af67ca1778f3bcbb3"} Oct 02 11:23:58 crc kubenswrapper[4835]: I1002 11:23:58.484949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4k8l" event={"ID":"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc","Type":"ContainerStarted","Data":"31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894"} Oct 02 11:23:58 crc kubenswrapper[4835]: I1002 11:23:58.486958 4835 generic.go:334] "Generic (PLEG): container finished" podID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerID="d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d" exitCode=0 Oct 02 11:23:58 crc kubenswrapper[4835]: I1002 11:23:58.487022 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj2vl" event={"ID":"04055e43-8a87-42b0-a7c0-bd480ad9a396","Type":"ContainerDied","Data":"d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d"} Oct 02 11:23:58 crc kubenswrapper[4835]: I1002 11:23:58.487054 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj2vl" event={"ID":"04055e43-8a87-42b0-a7c0-bd480ad9a396","Type":"ContainerStarted","Data":"42a1cafda571025668c9266364e92b6416898619efea2bfb59268fe66e83128c"} Oct 02 11:23:58 crc kubenswrapper[4835]: I1002 11:23:58.509575 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-94svm" 
podStartSLOduration=3.021849721 podStartE2EDuration="5.509552847s" podCreationTimestamp="2025-10-02 11:23:53 +0000 UTC" firstStartedPulling="2025-10-02 11:23:55.427148437 +0000 UTC m=+1711.987056018" lastFinishedPulling="2025-10-02 11:23:57.914851563 +0000 UTC m=+1714.474759144" observedRunningTime="2025-10-02 11:23:58.499103234 +0000 UTC m=+1715.059010815" watchObservedRunningTime="2025-10-02 11:23:58.509552847 +0000 UTC m=+1715.069460428" Oct 02 11:23:58 crc kubenswrapper[4835]: I1002 11:23:58.553134 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n4k8l" podStartSLOduration=3.527229786 podStartE2EDuration="5.553112491s" podCreationTimestamp="2025-10-02 11:23:53 +0000 UTC" firstStartedPulling="2025-10-02 11:23:55.428355152 +0000 UTC m=+1711.988262733" lastFinishedPulling="2025-10-02 11:23:57.454237857 +0000 UTC m=+1714.014145438" observedRunningTime="2025-10-02 11:23:58.546920211 +0000 UTC m=+1715.106827792" watchObservedRunningTime="2025-10-02 11:23:58.553112491 +0000 UTC m=+1715.113020072" Oct 02 11:24:00 crc kubenswrapper[4835]: I1002 11:24:00.510288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj2vl" event={"ID":"04055e43-8a87-42b0-a7c0-bd480ad9a396","Type":"ContainerStarted","Data":"1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561"} Oct 02 11:24:01 crc kubenswrapper[4835]: I1002 11:24:01.519781 4835 generic.go:334] "Generic (PLEG): container finished" podID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerID="1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561" exitCode=0 Oct 02 11:24:01 crc kubenswrapper[4835]: I1002 11:24:01.519891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj2vl" event={"ID":"04055e43-8a87-42b0-a7c0-bd480ad9a396","Type":"ContainerDied","Data":"1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561"} Oct 02 11:24:02 crc kubenswrapper[4835]: I1002 11:24:02.532628 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj2vl" event={"ID":"04055e43-8a87-42b0-a7c0-bd480ad9a396","Type":"ContainerStarted","Data":"eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc"} Oct 02 11:24:02 crc kubenswrapper[4835]: I1002 11:24:02.563529 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lj2vl" podStartSLOduration=3.129130783 podStartE2EDuration="6.56351s" podCreationTimestamp="2025-10-02 11:23:56 +0000 UTC" firstStartedPulling="2025-10-02 11:23:58.488327082 +0000 UTC m=+1715.048234663" lastFinishedPulling="2025-10-02 11:24:01.922706299 +0000 UTC m=+1718.482613880" observedRunningTime="2025-10-02 11:24:02.557672611 +0000 UTC m=+1719.117580192" watchObservedRunningTime="2025-10-02 11:24:02.56351 +0000 UTC m=+1719.123417581" Oct 02 11:24:03 crc kubenswrapper[4835]: I1002 11:24:03.650752 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:24:03 crc kubenswrapper[4835]: I1002 11:24:03.654324 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:24:03 crc kubenswrapper[4835]: I1002 11:24:03.806955 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:24:03 crc kubenswrapper[4835]: I1002 11:24:03.807378 
4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:24:03 crc kubenswrapper[4835]: I1002 11:24:03.854893 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:24:04 crc kubenswrapper[4835]: I1002 11:24:04.611523 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:24:04 crc kubenswrapper[4835]: I1002 11:24:04.737251 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-94svm" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="registry-server" probeResult="failure" output=< Oct 02 11:24:04 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 02 11:24:04 crc kubenswrapper[4835]: > Oct 02 11:24:06 crc kubenswrapper[4835]: I1002 11:24:06.252541 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:24:06 crc kubenswrapper[4835]: E1002 11:24:06.253312 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:24:06 crc kubenswrapper[4835]: I1002 11:24:06.280291 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4k8l"] Oct 02 11:24:06 crc kubenswrapper[4835]: I1002 11:24:06.964668 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:24:06 crc kubenswrapper[4835]: I1002 11:24:06.964904 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:24:07 crc kubenswrapper[4835]: I1002 11:24:07.015407 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:24:07 crc kubenswrapper[4835]: I1002 11:24:07.592422 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n4k8l" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="registry-server" containerID="cri-o://31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894" gracePeriod=2 Oct 02 11:24:07 crc kubenswrapper[4835]: I1002 11:24:07.637645 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.044262 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.132053 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-utilities\") pod \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.132126 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-catalog-content\") pod \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.132183 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z49v5\" (UniqueName: \"kubernetes.io/projected/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-kube-api-access-z49v5\") pod \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\" (UID: \"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc\") " Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.132779 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-utilities" (OuterVolumeSpecName: "utilities") pod "ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" (UID: "ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.140532 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-kube-api-access-z49v5" (OuterVolumeSpecName: "kube-api-access-z49v5") pod "ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" (UID: "ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc"). InnerVolumeSpecName "kube-api-access-z49v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.186564 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" (UID: "ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.234688 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.234727 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.234739 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z49v5\" (UniqueName: \"kubernetes.io/projected/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc-kube-api-access-z49v5\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.602925 4835 generic.go:334] "Generic (PLEG): container finished" podID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerID="31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894" exitCode=0 Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.602967 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4k8l" event={"ID":"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc","Type":"ContainerDied","Data":"31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894"} Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.603019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4k8l" event={"ID":"ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc","Type":"ContainerDied","Data":"9c6e09a889147c1a7a188c60fe44b76aff0a054650cf259dda5a4c5e557d4e52"} Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.603017 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4k8l" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.603039 4835 scope.go:117] "RemoveContainer" containerID="31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.627477 4835 scope.go:117] "RemoveContainer" containerID="f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.630116 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4k8l"] Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.637958 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n4k8l"] Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.648467 4835 scope.go:117] "RemoveContainer" containerID="89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.707751 4835 scope.go:117] "RemoveContainer" containerID="31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894" Oct 02 11:24:08 crc kubenswrapper[4835]: E1002 11:24:08.708326 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894\": container with ID starting with 31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894 not found: ID does not exist" containerID="31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.708358 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894"} err="failed to get container status \"31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894\": rpc error: code = NotFound desc = could not find container \"31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894\": container with ID starting with 31b7f4fb86f713e2b55c4f10136465cddbf20affccfcf2e7a1bcff8acf2ee894 not found: ID does not exist" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.708381 4835 scope.go:117] "RemoveContainer" containerID="f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4" Oct 02 11:24:08 crc kubenswrapper[4835]: E1002 11:24:08.708692 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4\": container with ID starting with f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4 not found: ID does not exist" containerID="f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.708760 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4"} err="failed to get container status \"f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4\": rpc error: code = NotFound desc = could not find container \"f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4\": container with ID starting with f0e6c7fa7641434aa57583a02a42f58a570abda3be69c65e7836b0c6e44716b4 not found: ID does not exist" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.708787 4835 scope.go:117] "RemoveContainer" 
containerID="89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7" Oct 02 11:24:08 crc kubenswrapper[4835]: E1002 11:24:08.709112 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7\": container with ID starting with 89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7 not found: ID does not exist" containerID="89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7" Oct 02 11:24:08 crc kubenswrapper[4835]: I1002 11:24:08.709144 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7"} err="failed to get container status \"89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7\": rpc error: code = NotFound desc = could not find container \"89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7\": container with ID starting with 89af58d53b801ec1477aad306ad5f1a68a6501304020776a3abe4414e131f4e7 not found: ID does not exist" Oct 02 11:24:09 crc kubenswrapper[4835]: I1002 11:24:09.281676 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lj2vl"] Oct 02 11:24:10 crc kubenswrapper[4835]: I1002 11:24:10.262819 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" path="/var/lib/kubelet/pods/ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc/volumes" Oct 02 11:24:10 crc kubenswrapper[4835]: I1002 11:24:10.643113 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lj2vl" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="registry-server" containerID="cri-o://eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc" gracePeriod=2 Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.107979 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.188788 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-catalog-content\") pod \"04055e43-8a87-42b0-a7c0-bd480ad9a396\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.189206 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-utilities\") pod \"04055e43-8a87-42b0-a7c0-bd480ad9a396\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.189371 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r2r4\" (UniqueName: \"kubernetes.io/projected/04055e43-8a87-42b0-a7c0-bd480ad9a396-kube-api-access-8r2r4\") pod \"04055e43-8a87-42b0-a7c0-bd480ad9a396\" (UID: \"04055e43-8a87-42b0-a7c0-bd480ad9a396\") " Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.190659 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-utilities" (OuterVolumeSpecName: "utilities") pod "04055e43-8a87-42b0-a7c0-bd480ad9a396" (UID: "04055e43-8a87-42b0-a7c0-bd480ad9a396"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.195164 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04055e43-8a87-42b0-a7c0-bd480ad9a396-kube-api-access-8r2r4" (OuterVolumeSpecName: "kube-api-access-8r2r4") pod "04055e43-8a87-42b0-a7c0-bd480ad9a396" (UID: "04055e43-8a87-42b0-a7c0-bd480ad9a396"). InnerVolumeSpecName "kube-api-access-8r2r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.231879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04055e43-8a87-42b0-a7c0-bd480ad9a396" (UID: "04055e43-8a87-42b0-a7c0-bd480ad9a396"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.290988 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.291023 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r2r4\" (UniqueName: \"kubernetes.io/projected/04055e43-8a87-42b0-a7c0-bd480ad9a396-kube-api-access-8r2r4\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.291032 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04055e43-8a87-42b0-a7c0-bd480ad9a396-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.670781 4835 generic.go:334] "Generic (PLEG): container finished" podID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerID="eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc" exitCode=0 Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.670836 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj2vl" event={"ID":"04055e43-8a87-42b0-a7c0-bd480ad9a396","Type":"ContainerDied","Data":"eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc"} Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.670867 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj2vl" event={"ID":"04055e43-8a87-42b0-a7c0-bd480ad9a396","Type":"ContainerDied","Data":"42a1cafda571025668c9266364e92b6416898619efea2bfb59268fe66e83128c"} Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.670891 4835 scope.go:117] "RemoveContainer" containerID="eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.670997 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lj2vl" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.700935 4835 scope.go:117] "RemoveContainer" containerID="1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.715472 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lj2vl"] Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.722142 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lj2vl"] Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.743115 4835 scope.go:117] "RemoveContainer" containerID="d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.765782 4835 scope.go:117] "RemoveContainer" containerID="eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc" Oct 02 11:24:11 crc kubenswrapper[4835]: E1002 11:24:11.766313 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc\": container with ID starting with eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc not found: ID does not exist" containerID="eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.766353 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc"} err="failed to get container status \"eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc\": rpc error: code = NotFound desc = could not find container \"eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc\": container with ID starting with eb4c189816bfd89fc7fc44da08c261239ee59b45c7952e880d5658d889f417bc not found: ID does not exist" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.766376 4835 scope.go:117] "RemoveContainer" containerID="1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561" Oct 02 11:24:11 crc kubenswrapper[4835]: E1002 11:24:11.766738 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561\": container with ID starting with 1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561 not found: ID does not exist" containerID="1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.766795 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561"} err="failed to get container status \"1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561\": rpc error: code = NotFound desc = could not find container \"1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561\": container with ID starting with 1177689e0aade93a9b4d7bfe87cd7c579ac559b003274874d0c67b0b5a192561 not found: ID does not exist" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.766849 4835 scope.go:117] "RemoveContainer" containerID="d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d" Oct 02 11:24:11 crc kubenswrapper[4835]: E1002 11:24:11.767185 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d\": container with ID starting with d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d not found: ID does not exist" containerID="d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d" Oct 02 11:24:11 crc kubenswrapper[4835]: I1002 11:24:11.767215 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d"} err="failed to get container status \"d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d\": rpc error: code = NotFound desc = could not find container \"d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d\": container with ID starting with d79ea336b151b70225f2a89ae40f89c267e64ae883ad26e8f2bc56b233febc3d not found: ID does not exist" Oct 02 11:24:12 crc kubenswrapper[4835]: I1002 11:24:12.264769 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" path="/var/lib/kubelet/pods/04055e43-8a87-42b0-a7c0-bd480ad9a396/volumes" Oct 02 11:24:14 crc kubenswrapper[4835]: I1002 11:24:14.707157 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-94svm" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="registry-server" probeResult="failure" output=< Oct 02 11:24:14 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 02 11:24:14 crc kubenswrapper[4835]: > Oct 02 11:24:18 crc kubenswrapper[4835]: I1002 11:24:18.252952 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:24:18 crc kubenswrapper[4835]: E1002 11:24:18.253567 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:24:19 crc kubenswrapper[4835]: I1002 11:24:19.741392 4835 generic.go:334] "Generic (PLEG): container finished" podID="2ed489f8-6c30-403d-8634-2c67229b4114" containerID="acac1ce3832e8a4ea0cff66ecd507902e9ecd7896984dfd99b4dc495c33fa7de" exitCode=0 Oct 02 11:24:19 crc kubenswrapper[4835]: I1002 11:24:19.741450 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" event={"ID":"2ed489f8-6c30-403d-8634-2c67229b4114","Type":"ContainerDied","Data":"acac1ce3832e8a4ea0cff66ecd507902e9ecd7896984dfd99b4dc495c33fa7de"} Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.184636 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.371419 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-ssh-key\") pod \"2ed489f8-6c30-403d-8634-2c67229b4114\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.371545 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6b88\" (UniqueName: \"kubernetes.io/projected/2ed489f8-6c30-403d-8634-2c67229b4114-kube-api-access-p6b88\") pod \"2ed489f8-6c30-403d-8634-2c67229b4114\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.371601 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-inventory\") pod \"2ed489f8-6c30-403d-8634-2c67229b4114\" (UID: \"2ed489f8-6c30-403d-8634-2c67229b4114\") " Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.377695 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed489f8-6c30-403d-8634-2c67229b4114-kube-api-access-p6b88" (OuterVolumeSpecName: "kube-api-access-p6b88") pod "2ed489f8-6c30-403d-8634-2c67229b4114" (UID: "2ed489f8-6c30-403d-8634-2c67229b4114"). InnerVolumeSpecName "kube-api-access-p6b88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.396700 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2ed489f8-6c30-403d-8634-2c67229b4114" (UID: "2ed489f8-6c30-403d-8634-2c67229b4114"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.398731 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-inventory" (OuterVolumeSpecName: "inventory") pod "2ed489f8-6c30-403d-8634-2c67229b4114" (UID: "2ed489f8-6c30-403d-8634-2c67229b4114"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.473624 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.473656 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6b88\" (UniqueName: \"kubernetes.io/projected/2ed489f8-6c30-403d-8634-2c67229b4114-kube-api-access-p6b88\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.473670 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ed489f8-6c30-403d-8634-2c67229b4114-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.760347 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" event={"ID":"2ed489f8-6c30-403d-8634-2c67229b4114","Type":"ContainerDied","Data":"3380702e46c3ce15895ac88934ca952d4a74b741d8adeb29a7458bdd24d60b79"} Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.760394 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3380702e46c3ce15895ac88934ca952d4a74b741d8adeb29a7458bdd24d60b79" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.760423 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.843410 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm"] Oct 02 11:24:21 crc kubenswrapper[4835]: E1002 11:24:21.843858 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="extract-utilities" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.843882 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="extract-utilities" Oct 02 11:24:21 crc kubenswrapper[4835]: E1002 11:24:21.843909 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="extract-content" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.843919 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="extract-content" Oct 02 11:24:21 crc kubenswrapper[4835]: E1002 11:24:21.843947 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="extract-content" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.843955 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="extract-content" Oct 02 11:24:21 crc kubenswrapper[4835]: E1002 11:24:21.843977 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="registry-server" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.843986 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="registry-server" Oct 02 11:24:21 crc kubenswrapper[4835]: E1002 11:24:21.844000 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="registry-server" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.844007 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="registry-server" Oct 02 11:24:21 crc kubenswrapper[4835]: E1002 11:24:21.844023 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed489f8-6c30-403d-8634-2c67229b4114" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.844032 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed489f8-6c30-403d-8634-2c67229b4114" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:24:21 crc kubenswrapper[4835]: E1002 11:24:21.844049 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="extract-utilities" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.844058 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="extract-utilities" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.844274 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed489f8-6c30-403d-8634-2c67229b4114" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.844308 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="04055e43-8a87-42b0-a7c0-bd480ad9a396" containerName="registry-server" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.844325 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd8b932-c11e-4914-b7f8-bb5b1f13e9bc" containerName="registry-server" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.845092 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.847747 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.847943 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.848114 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.859107 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.861518 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm"] Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.984390 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.984818 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6jz\" (UniqueName: \"kubernetes.io/projected/71be51be-d812-42b5-9579-e6d50e13eeda-kube-api-access-5t6jz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:21 crc kubenswrapper[4835]: I1002 11:24:21.984998 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.086141 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6jz\" (UniqueName: \"kubernetes.io/projected/71be51be-d812-42b5-9579-e6d50e13eeda-kube-api-access-5t6jz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.086239 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.086310 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.090118 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.090464 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.104706 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6jz\" (UniqueName: \"kubernetes.io/projected/71be51be-d812-42b5-9579-e6d50e13eeda-kube-api-access-5t6jz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.176767 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.713602 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm"] Oct 02 11:24:22 crc kubenswrapper[4835]: I1002 11:24:22.770759 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" event={"ID":"71be51be-d812-42b5-9579-e6d50e13eeda","Type":"ContainerStarted","Data":"f51aff7c30d46098e5f8f1ddfd6c0474b8ad652bedc636f43ca109002fe7947a"} Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.037615 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6f51-account-create-7wtfl"] Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.044991 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-73f9-account-create-kgdkc"] Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.052735 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6f51-account-create-7wtfl"] Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.059612 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-73f9-account-create-kgdkc"] Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.708902 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.764575 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.781634 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" 
event={"ID":"71be51be-d812-42b5-9579-e6d50e13eeda","Type":"ContainerStarted","Data":"6f492035d9d9991cbe6d8eca9baaedb65aba3144732de85280f28a3da60d82da"} Oct 02 11:24:23 crc kubenswrapper[4835]: I1002 11:24:23.802311 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" podStartSLOduration=2.369177994 podStartE2EDuration="2.802291532s" podCreationTimestamp="2025-10-02 11:24:21 +0000 UTC" firstStartedPulling="2025-10-02 11:24:22.715955533 +0000 UTC m=+1739.275863114" lastFinishedPulling="2025-10-02 11:24:23.149069071 +0000 UTC m=+1739.708976652" observedRunningTime="2025-10-02 11:24:23.798524343 +0000 UTC m=+1740.358431924" watchObservedRunningTime="2025-10-02 11:24:23.802291532 +0000 UTC m=+1740.362199123" Oct 02 11:24:24 crc kubenswrapper[4835]: I1002 11:24:24.266246 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492bc087-f2ed-4b40-8fe0-74accde085ce" path="/var/lib/kubelet/pods/492bc087-f2ed-4b40-8fe0-74accde085ce/volumes" Oct 02 11:24:24 crc kubenswrapper[4835]: I1002 11:24:24.266802 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53018b21-5abb-49d3-9098-041193565c81" path="/var/lib/kubelet/pods/53018b21-5abb-49d3-9098-041193565c81/volumes" Oct 02 11:24:26 crc kubenswrapper[4835]: I1002 11:24:26.032929 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vg55h"] Oct 02 11:24:26 crc kubenswrapper[4835]: I1002 11:24:26.040086 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vg55h"] Oct 02 11:24:26 crc kubenswrapper[4835]: I1002 11:24:26.262504 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="152bf752-1382-4613-84cb-a392f0666d6c" path="/var/lib/kubelet/pods/152bf752-1382-4613-84cb-a392f0666d6c/volumes" Oct 02 11:24:27 crc kubenswrapper[4835]: I1002 11:24:27.678645 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-94svm"] Oct 02 11:24:27 crc kubenswrapper[4835]: I1002 11:24:27.679082 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-94svm" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="registry-server" containerID="cri-o://e9d76b3a35710b22c6472e18d2c0b25343382e817c82159af67ca1778f3bcbb3" gracePeriod=2 Oct 02 11:24:27 crc kubenswrapper[4835]: I1002 11:24:27.829701 4835 generic.go:334] "Generic (PLEG): container finished" podID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerID="e9d76b3a35710b22c6472e18d2c0b25343382e817c82159af67ca1778f3bcbb3" exitCode=0 Oct 02 11:24:27 crc kubenswrapper[4835]: I1002 11:24:27.829748 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94svm" event={"ID":"51aa88a4-9464-47d2-855f-d7865545c0eb","Type":"ContainerDied","Data":"e9d76b3a35710b22c6472e18d2c0b25343382e817c82159af67ca1778f3bcbb3"} Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.212089 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.290211 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-utilities\") pod \"51aa88a4-9464-47d2-855f-d7865545c0eb\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.290279 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-catalog-content\") pod \"51aa88a4-9464-47d2-855f-d7865545c0eb\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.290593 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rs96\" (UniqueName: \"kubernetes.io/projected/51aa88a4-9464-47d2-855f-d7865545c0eb-kube-api-access-4rs96\") pod \"51aa88a4-9464-47d2-855f-d7865545c0eb\" (UID: \"51aa88a4-9464-47d2-855f-d7865545c0eb\") " Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.291217 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-utilities" (OuterVolumeSpecName: "utilities") pod "51aa88a4-9464-47d2-855f-d7865545c0eb" (UID: "51aa88a4-9464-47d2-855f-d7865545c0eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.291336 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.296124 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51aa88a4-9464-47d2-855f-d7865545c0eb-kube-api-access-4rs96" (OuterVolumeSpecName: "kube-api-access-4rs96") pod "51aa88a4-9464-47d2-855f-d7865545c0eb" (UID: "51aa88a4-9464-47d2-855f-d7865545c0eb"). InnerVolumeSpecName "kube-api-access-4rs96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.393354 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rs96\" (UniqueName: \"kubernetes.io/projected/51aa88a4-9464-47d2-855f-d7865545c0eb-kube-api-access-4rs96\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.399452 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51aa88a4-9464-47d2-855f-d7865545c0eb" (UID: "51aa88a4-9464-47d2-855f-d7865545c0eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.495009 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51aa88a4-9464-47d2-855f-d7865545c0eb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.847369 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94svm" event={"ID":"51aa88a4-9464-47d2-855f-d7865545c0eb","Type":"ContainerDied","Data":"10c776646d4d392cc4ff43aa6ea61ecc6a12707bad82124c850b249ed57bcc11"} Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.847443 4835 scope.go:117] "RemoveContainer" containerID="e9d76b3a35710b22c6472e18d2c0b25343382e817c82159af67ca1778f3bcbb3" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.847662 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94svm" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.854037 4835 generic.go:334] "Generic (PLEG): container finished" podID="71be51be-d812-42b5-9579-e6d50e13eeda" containerID="6f492035d9d9991cbe6d8eca9baaedb65aba3144732de85280f28a3da60d82da" exitCode=0 Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.854080 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" event={"ID":"71be51be-d812-42b5-9579-e6d50e13eeda","Type":"ContainerDied","Data":"6f492035d9d9991cbe6d8eca9baaedb65aba3144732de85280f28a3da60d82da"} Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.880172 4835 scope.go:117] "RemoveContainer" containerID="c53663fffe41130f5ccadb4330cf395116b02ec64c5b46714af727c9a7ed4d19" Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.899632 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-94svm"] Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.908032 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-94svm"] Oct 02 11:24:28 crc kubenswrapper[4835]: I1002 11:24:28.914075 4835 scope.go:117] "RemoveContainer" containerID="1e8e7a6f06e32eea0990fda1433d2694237f40a8999b98748c4d3f79dae7ce95" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.262285 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.262746 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" path="/var/lib/kubelet/pods/51aa88a4-9464-47d2-855f-d7865545c0eb/volumes" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.429034 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-ssh-key\") pod \"71be51be-d812-42b5-9579-e6d50e13eeda\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.429365 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-inventory\") pod \"71be51be-d812-42b5-9579-e6d50e13eeda\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.429518 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t6jz\" (UniqueName: \"kubernetes.io/projected/71be51be-d812-42b5-9579-e6d50e13eeda-kube-api-access-5t6jz\") pod \"71be51be-d812-42b5-9579-e6d50e13eeda\" (UID: \"71be51be-d812-42b5-9579-e6d50e13eeda\") " Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.434621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71be51be-d812-42b5-9579-e6d50e13eeda-kube-api-access-5t6jz" (OuterVolumeSpecName: "kube-api-access-5t6jz") pod "71be51be-d812-42b5-9579-e6d50e13eeda" (UID: "71be51be-d812-42b5-9579-e6d50e13eeda"). InnerVolumeSpecName "kube-api-access-5t6jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.460970 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "71be51be-d812-42b5-9579-e6d50e13eeda" (UID: "71be51be-d812-42b5-9579-e6d50e13eeda"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.461606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-inventory" (OuterVolumeSpecName: "inventory") pod "71be51be-d812-42b5-9579-e6d50e13eeda" (UID: "71be51be-d812-42b5-9579-e6d50e13eeda"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.531717 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.531741 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71be51be-d812-42b5-9579-e6d50e13eeda-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.531752 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t6jz\" (UniqueName: \"kubernetes.io/projected/71be51be-d812-42b5-9579-e6d50e13eeda-kube-api-access-5t6jz\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.878697 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" event={"ID":"71be51be-d812-42b5-9579-e6d50e13eeda","Type":"ContainerDied","Data":"f51aff7c30d46098e5f8f1ddfd6c0474b8ad652bedc636f43ca109002fe7947a"} Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.878747 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f51aff7c30d46098e5f8f1ddfd6c0474b8ad652bedc636f43ca109002fe7947a" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.878776 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.945647 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9"] Oct 02 11:24:31 crc kubenswrapper[4835]: E1002 11:24:30.946066 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71be51be-d812-42b5-9579-e6d50e13eeda" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.946082 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="71be51be-d812-42b5-9579-e6d50e13eeda" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:24:31 crc kubenswrapper[4835]: E1002 11:24:30.946101 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="extract-content" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.946109 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="extract-content" Oct 02 11:24:31 crc kubenswrapper[4835]: E1002 11:24:30.946128 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="registry-server" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.946136 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="registry-server" Oct 02 11:24:31 crc kubenswrapper[4835]: E1002 11:24:30.946161 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="extract-utilities" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.946170 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="extract-utilities" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.946460 4835 
memory_manager.go:354] "RemoveStaleState removing state" podUID="71be51be-d812-42b5-9579-e6d50e13eeda" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.946491 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="51aa88a4-9464-47d2-855f-d7865545c0eb" containerName="registry-server" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.947390 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.952757 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9"] Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.981898 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.982067 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.982187 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:30.982372 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.041305 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86lx\" (UniqueName: \"kubernetes.io/projected/0be9db5e-b163-4f34-9030-c565b5da60ca-kube-api-access-c86lx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.041372 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.041482 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.143834 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.144057 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c86lx\" (UniqueName: \"kubernetes.io/projected/0be9db5e-b163-4f34-9030-c565b5da60ca-kube-api-access-c86lx\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.144109 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.151874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.152229 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.161105 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86lx\" (UniqueName: \"kubernetes.io/projected/0be9db5e-b163-4f34-9030-c565b5da60ca-kube-api-access-c86lx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mfld9\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.309715 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.805488 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9"] Oct 02 11:24:31 crc kubenswrapper[4835]: I1002 11:24:31.888603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" event={"ID":"0be9db5e-b163-4f34-9030-c565b5da60ca","Type":"ContainerStarted","Data":"aced3bc6917d5fec4521a1039f5d6c0f75c3733206aa15b4a9f7c8e16bbdb71f"} Oct 02 11:24:32 crc kubenswrapper[4835]: I1002 11:24:32.251845 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:24:32 crc kubenswrapper[4835]: E1002 11:24:32.252166 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:24:32 crc kubenswrapper[4835]: I1002 11:24:32.912890 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" event={"ID":"0be9db5e-b163-4f34-9030-c565b5da60ca","Type":"ContainerStarted","Data":"405caea66cbf92d42f0762d9c583403142c761591f239825d7ba292c57ca7edf"} Oct 02 11:24:32 crc kubenswrapper[4835]: I1002 11:24:32.935283 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" podStartSLOduration=2.209838737 podStartE2EDuration="2.935263773s" podCreationTimestamp="2025-10-02 11:24:30 +0000 UTC" firstStartedPulling="2025-10-02 11:24:31.810757825 +0000 UTC m=+1748.370665406" lastFinishedPulling="2025-10-02 11:24:32.536182861 +0000 UTC m=+1749.096090442" observedRunningTime="2025-10-02 11:24:32.925792358 +0000 UTC m=+1749.485699939" watchObservedRunningTime="2025-10-02 11:24:32.935263773 +0000 UTC m=+1749.495171354" Oct 02 11:24:41 crc kubenswrapper[4835]: I1002 11:24:41.038787 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jwqkm"] Oct 02 11:24:41 crc kubenswrapper[4835]: I1002 11:24:41.048372 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jwqkm"] Oct 02 11:24:42 crc kubenswrapper[4835]: I1002 11:24:42.264153 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01665ace-3f34-4029-a202-6f350e8497f9" path="/var/lib/kubelet/pods/01665ace-3f34-4029-a202-6f350e8497f9/volumes" Oct 02 11:24:47 crc kubenswrapper[4835]: I1002 11:24:47.252336 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:24:47 crc kubenswrapper[4835]: E1002 11:24:47.253648 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:24:57 crc 
kubenswrapper[4835]: I1002 11:24:57.740994 4835 scope.go:117] "RemoveContainer" containerID="584b421f8a9321a47c9d94629bf8e2bd75fa8175eda32b2d55c3384b1fdf6f90" Oct 02 11:24:57 crc kubenswrapper[4835]: I1002 11:24:57.785976 4835 scope.go:117] "RemoveContainer" containerID="5dc34fc11def9148cac8bfdb3f4fd344006ef1894c9394c9b2ee6c62304082ef" Oct 02 11:24:57 crc kubenswrapper[4835]: I1002 11:24:57.823781 4835 scope.go:117] "RemoveContainer" containerID="cb8cf6922fd514e24e9fa8c0746f61209d84a55c0f311ae3fe3fd776f075bb5d" Oct 02 11:24:57 crc kubenswrapper[4835]: I1002 11:24:57.870261 4835 scope.go:117] "RemoveContainer" containerID="aeec7d539aee1c9aca026dcc0da91002669bb04ab1b72ce2177752885d3e5f88" Oct 02 11:25:00 crc kubenswrapper[4835]: I1002 11:25:00.251874 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:25:00 crc kubenswrapper[4835]: E1002 11:25:00.252614 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:25:06 crc kubenswrapper[4835]: I1002 11:25:06.196488 4835 generic.go:334] "Generic (PLEG): container finished" podID="0be9db5e-b163-4f34-9030-c565b5da60ca" containerID="405caea66cbf92d42f0762d9c583403142c761591f239825d7ba292c57ca7edf" exitCode=0 Oct 02 11:25:06 crc kubenswrapper[4835]: I1002 11:25:06.196588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" event={"ID":"0be9db5e-b163-4f34-9030-c565b5da60ca","Type":"ContainerDied","Data":"405caea66cbf92d42f0762d9c583403142c761591f239825d7ba292c57ca7edf"} Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.572181 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.710200 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-ssh-key\") pod \"0be9db5e-b163-4f34-9030-c565b5da60ca\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.710362 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c86lx\" (UniqueName: \"kubernetes.io/projected/0be9db5e-b163-4f34-9030-c565b5da60ca-kube-api-access-c86lx\") pod \"0be9db5e-b163-4f34-9030-c565b5da60ca\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.710379 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-inventory\") pod \"0be9db5e-b163-4f34-9030-c565b5da60ca\" (UID: \"0be9db5e-b163-4f34-9030-c565b5da60ca\") " Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.717098 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be9db5e-b163-4f34-9030-c565b5da60ca-kube-api-access-c86lx" (OuterVolumeSpecName: "kube-api-access-c86lx") pod "0be9db5e-b163-4f34-9030-c565b5da60ca" (UID: "0be9db5e-b163-4f34-9030-c565b5da60ca"). InnerVolumeSpecName "kube-api-access-c86lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.736874 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-inventory" (OuterVolumeSpecName: "inventory") pod "0be9db5e-b163-4f34-9030-c565b5da60ca" (UID: "0be9db5e-b163-4f34-9030-c565b5da60ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.737852 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0be9db5e-b163-4f34-9030-c565b5da60ca" (UID: "0be9db5e-b163-4f34-9030-c565b5da60ca"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.812096 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c86lx\" (UniqueName: \"kubernetes.io/projected/0be9db5e-b163-4f34-9030-c565b5da60ca-kube-api-access-c86lx\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.812149 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:07 crc kubenswrapper[4835]: I1002 11:25:07.812163 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0be9db5e-b163-4f34-9030-c565b5da60ca-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.214169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" event={"ID":"0be9db5e-b163-4f34-9030-c565b5da60ca","Type":"ContainerDied","Data":"aced3bc6917d5fec4521a1039f5d6c0f75c3733206aa15b4a9f7c8e16bbdb71f"} Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.214247 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.214218 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aced3bc6917d5fec4521a1039f5d6c0f75c3733206aa15b4a9f7c8e16bbdb71f" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.293016 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt"] Oct 02 11:25:08 crc kubenswrapper[4835]: E1002 11:25:08.293515 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be9db5e-b163-4f34-9030-c565b5da60ca" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.293541 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be9db5e-b163-4f34-9030-c565b5da60ca" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.293779 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be9db5e-b163-4f34-9030-c565b5da60ca" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.294561 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.296917 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.305445 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.305657 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.306339 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt"] Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.307593 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.421502 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brknq\" (UniqueName: \"kubernetes.io/projected/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-kube-api-access-brknq\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.421556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.421596 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.523354 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brknq\" (UniqueName: \"kubernetes.io/projected/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-kube-api-access-brknq\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.523428 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.523474 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" 
(UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.529731 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.529828 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.543355 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brknq\" (UniqueName: \"kubernetes.io/projected/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-kube-api-access-brknq\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:08 crc kubenswrapper[4835]: I1002 11:25:08.613079 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:09 crc kubenswrapper[4835]: I1002 11:25:09.161134 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt"] Oct 02 11:25:09 crc kubenswrapper[4835]: I1002 11:25:09.223580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" event={"ID":"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8","Type":"ContainerStarted","Data":"204375b1f78de054b80ef6022f4f47b8ce35770489fc9d2e3565d5db59d63f5c"} Oct 02 11:25:11 crc kubenswrapper[4835]: I1002 11:25:11.249955 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" event={"ID":"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8","Type":"ContainerStarted","Data":"275aad0297460f7c2885f2d92f56717591343c14a05d5ccc4fa6d32bfa99c4ee"} Oct 02 11:25:11 crc kubenswrapper[4835]: I1002 11:25:11.271602 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" podStartSLOduration=2.336345345 podStartE2EDuration="3.271578129s" podCreationTimestamp="2025-10-02 11:25:08 +0000 UTC" firstStartedPulling="2025-10-02 11:25:09.158960983 +0000 UTC m=+1785.718868564" lastFinishedPulling="2025-10-02 11:25:10.094193757 +0000 UTC m=+1786.654101348" observedRunningTime="2025-10-02 11:25:11.26611553 +0000 UTC m=+1787.826023111" watchObservedRunningTime="2025-10-02 11:25:11.271578129 +0000 UTC m=+1787.831485720" Oct 02 11:25:12 crc kubenswrapper[4835]: I1002 11:25:12.251606 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:25:12 crc kubenswrapper[4835]: E1002 11:25:12.251925 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:25:14 crc kubenswrapper[4835]: I1002 11:25:14.278168 4835 generic.go:334] "Generic (PLEG): container finished" podID="5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" containerID="275aad0297460f7c2885f2d92f56717591343c14a05d5ccc4fa6d32bfa99c4ee" exitCode=0 Oct 02 11:25:14 crc kubenswrapper[4835]: I1002 11:25:14.278250 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" event={"ID":"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8","Type":"ContainerDied","Data":"275aad0297460f7c2885f2d92f56717591343c14a05d5ccc4fa6d32bfa99c4ee"} Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.633356 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.753656 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-ssh-key\") pod \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.753794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-inventory\") pod \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.753831 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brknq\" (UniqueName: \"kubernetes.io/projected/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-kube-api-access-brknq\") pod \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\" (UID: \"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8\") " Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.759752 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-kube-api-access-brknq" (OuterVolumeSpecName: "kube-api-access-brknq") pod "5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" (UID: "5292edd5-df1a-4ab0-a2cc-0cf5846f60f8"). InnerVolumeSpecName "kube-api-access-brknq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.778491 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-inventory" (OuterVolumeSpecName: "inventory") pod "5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" (UID: "5292edd5-df1a-4ab0-a2cc-0cf5846f60f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.779980 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" (UID: "5292edd5-df1a-4ab0-a2cc-0cf5846f60f8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.855630 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.855896 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:15 crc kubenswrapper[4835]: I1002 11:25:15.855968 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brknq\" (UniqueName: \"kubernetes.io/projected/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8-kube-api-access-brknq\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.298607 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" event={"ID":"5292edd5-df1a-4ab0-a2cc-0cf5846f60f8","Type":"ContainerDied","Data":"204375b1f78de054b80ef6022f4f47b8ce35770489fc9d2e3565d5db59d63f5c"} Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.298656 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204375b1f78de054b80ef6022f4f47b8ce35770489fc9d2e3565d5db59d63f5c" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.298686 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.378801 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz"] Oct 02 11:25:16 crc kubenswrapper[4835]: E1002 11:25:16.379280 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.379307 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.379541 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.380325 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.382077 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.382531 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.382860 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.383014 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.388137 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz"] Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.465610 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjrwh\" (UniqueName: \"kubernetes.io/projected/88def01a-a300-40c2-bd09-e6c4ac838101-kube-api-access-kjrwh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.465715 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.465788 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.567003 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.567187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjrwh\" (UniqueName: \"kubernetes.io/projected/88def01a-a300-40c2-bd09-e6c4ac838101-kube-api-access-kjrwh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.567278 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" 
(UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.572269 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.573032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.588968 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjrwh\" (UniqueName: \"kubernetes.io/projected/88def01a-a300-40c2-bd09-e6c4ac838101-kube-api-access-kjrwh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:16 crc kubenswrapper[4835]: I1002 11:25:16.697079 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:25:17 crc kubenswrapper[4835]: I1002 11:25:17.203012 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz"] Oct 02 11:25:17 crc kubenswrapper[4835]: I1002 11:25:17.308165 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" event={"ID":"88def01a-a300-40c2-bd09-e6c4ac838101","Type":"ContainerStarted","Data":"3022c57f6273d56c61e5fc5b3c132b5b42e523dfd6b6355b9094ad76515d9ff4"} Oct 02 11:25:18 crc kubenswrapper[4835]: I1002 11:25:18.318198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" event={"ID":"88def01a-a300-40c2-bd09-e6c4ac838101","Type":"ContainerStarted","Data":"0f643c2ba99f6d78899f4ae4441e5bdc63aa96293f114905e0fa054d3147b5e5"} Oct 02 11:25:18 crc kubenswrapper[4835]: I1002 11:25:18.345017 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" podStartSLOduration=1.662948046 podStartE2EDuration="2.344993223s" podCreationTimestamp="2025-10-02 11:25:16 +0000 UTC" firstStartedPulling="2025-10-02 11:25:17.206463112 +0000 UTC m=+1793.766370683" lastFinishedPulling="2025-10-02 11:25:17.888508279 +0000 UTC m=+1794.448415860" observedRunningTime="2025-10-02 11:25:18.33598353 +0000 UTC m=+1794.895891111" watchObservedRunningTime="2025-10-02 11:25:18.344993223 +0000 UTC m=+1794.904900814" Oct 02 11:25:22 crc kubenswrapper[4835]: I1002 11:25:22.042240 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rk79x"] Oct 02 11:25:22 crc kubenswrapper[4835]: I1002 11:25:22.049450 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rk79x"] Oct 02 11:25:22 crc kubenswrapper[4835]: I1002 11:25:22.262493 4835 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="e627031e-a0d4-459c-9250-bfdcf645d133" path="/var/lib/kubelet/pods/e627031e-a0d4-459c-9250-bfdcf645d133/volumes" Oct 02 11:25:23 crc kubenswrapper[4835]: I1002 11:25:23.033129 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2vdk6"] Oct 02 11:25:23 crc kubenswrapper[4835]: I1002 11:25:23.041966 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2vdk6"] Oct 02 11:25:24 crc kubenswrapper[4835]: I1002 11:25:24.263899 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc534d7c-ef08-44e5-b56d-d3421477c51d" path="/var/lib/kubelet/pods/fc534d7c-ef08-44e5-b56d-d3421477c51d/volumes" Oct 02 11:25:25 crc kubenswrapper[4835]: I1002 11:25:25.034631 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5nwmn"] Oct 02 11:25:25 crc kubenswrapper[4835]: I1002 11:25:25.045505 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5nwmn"] Oct 02 11:25:26 crc kubenswrapper[4835]: I1002 11:25:26.026941 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4p7h4"] Oct 02 11:25:26 crc kubenswrapper[4835]: I1002 11:25:26.035623 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4p7h4"] Oct 02 11:25:26 crc kubenswrapper[4835]: I1002 11:25:26.266056 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be5aafd-caf7-4d3e-a651-774f264ff938" path="/var/lib/kubelet/pods/0be5aafd-caf7-4d3e-a651-774f264ff938/volumes" Oct 02 11:25:26 crc kubenswrapper[4835]: I1002 11:25:26.266829 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa9590e-e7df-41eb-9dd8-d22a2f382b94" path="/var/lib/kubelet/pods/7aa9590e-e7df-41eb-9dd8-d22a2f382b94/volumes" Oct 02 11:25:27 crc kubenswrapper[4835]: I1002 11:25:27.251607 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:25:27 crc kubenswrapper[4835]: E1002 11:25:27.252172 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:25:28 crc kubenswrapper[4835]: I1002 11:25:28.031101 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-glqq5"] Oct 02 11:25:28 crc kubenswrapper[4835]: I1002 11:25:28.040201 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-glqq5"] Oct 02 11:25:28 crc kubenswrapper[4835]: I1002 11:25:28.266922 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc6d4089-7586-4a01-a18e-7cdb9da91783" path="/var/lib/kubelet/pods/fc6d4089-7586-4a01-a18e-7cdb9da91783/volumes" Oct 02 11:25:35 crc kubenswrapper[4835]: I1002 11:25:35.030965 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8999-account-create-xt68d"] Oct 02 11:25:35 crc kubenswrapper[4835]: I1002 11:25:35.042357 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8999-account-create-xt68d"] Oct 02 11:25:36 crc kubenswrapper[4835]: I1002 11:25:36.027509 4835 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell1-7cd3-account-create-bkdqw"] Oct 02 11:25:36 crc kubenswrapper[4835]: I1002 11:25:36.036531 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7cd3-account-create-bkdqw"] Oct 02 11:25:36 crc kubenswrapper[4835]: I1002 11:25:36.271249 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="806ec124-c88e-471a-891f-a76296deb62a" path="/var/lib/kubelet/pods/806ec124-c88e-471a-891f-a76296deb62a/volumes" Oct 02 11:25:36 crc kubenswrapper[4835]: I1002 11:25:36.272512 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de48fc0-9dd0-4a25-894e-e27bca99f97f" path="/var/lib/kubelet/pods/9de48fc0-9dd0-4a25-894e-e27bca99f97f/volumes" Oct 02 11:25:37 crc kubenswrapper[4835]: I1002 11:25:37.030849 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cd62-account-create-jfhjr"] Oct 02 11:25:37 crc kubenswrapper[4835]: I1002 11:25:37.039145 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cd62-account-create-jfhjr"] Oct 02 11:25:38 crc kubenswrapper[4835]: I1002 11:25:38.261090 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca21d32-62e0-4438-85cc-5a60c3933915" path="/var/lib/kubelet/pods/7ca21d32-62e0-4438-85cc-5a60c3933915/volumes" Oct 02 11:25:42 crc kubenswrapper[4835]: I1002 11:25:42.251982 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:25:42 crc kubenswrapper[4835]: E1002 11:25:42.252573 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:25:53 crc kubenswrapper[4835]: I1002 11:25:53.252339 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:25:53 crc kubenswrapper[4835]: E1002 11:25:53.253274 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.048723 4835 scope.go:117] "RemoveContainer" containerID="484c49d26816d65fc222f1890a428d13caa81de3af992b8148ce1646a1daa389" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.070497 4835 scope.go:117] "RemoveContainer" containerID="fdee90d3ebdc9eb5b237720a313faf11147d95931b3fc5f641316a2ea9cb676b" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.091424 4835 scope.go:117] "RemoveContainer" containerID="437a0cbe29e379a31f823500d05a253be92ecd244eea66fdb4f15d5f0ebe61a0" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.153066 4835 scope.go:117] "RemoveContainer" containerID="263b248f928928c12d3e51bd98da87292f41d4d0eb1b0152d162c77501978a29" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.228136 4835 scope.go:117] "RemoveContainer" 
containerID="09ab146fade6566a9fb02d4c99966c2be19fec660d446ec9fd8cec25b4263652" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.256470 4835 scope.go:117] "RemoveContainer" containerID="0e5d8f9241c40fb080ed6dcd04f01be52e988873cc2fd40b04c9695a727fc99f" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.311383 4835 scope.go:117] "RemoveContainer" containerID="da9588ed54e9c84dc41cd58113b37f1f640de3761ce9666c815c4c3fade6d142" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.348305 4835 scope.go:117] "RemoveContainer" containerID="5f94f9e7417bf348c29cadfb2e873a282041a1a30fd5ae756807f4d33b8acdb5" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.364673 4835 scope.go:117] "RemoveContainer" containerID="9ee76de918bc4e19b8144d3bd91d16557d815e1f1a8213cead7b8641ab72641f" Oct 02 11:25:58 crc kubenswrapper[4835]: I1002 11:25:58.384425 4835 scope.go:117] "RemoveContainer" containerID="cd596dd7792ef11c94784376d1552c2863357281364a72de1abc0804776a24a8" Oct 02 11:26:04 crc kubenswrapper[4835]: I1002 11:26:04.264177 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:26:04 crc kubenswrapper[4835]: E1002 11:26:04.265472 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:26:05 crc kubenswrapper[4835]: I1002 11:26:05.051165 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-rk9m4"] Oct 02 11:26:05 crc kubenswrapper[4835]: I1002 11:26:05.060332 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rk9m4"] Oct 02 11:26:06 crc kubenswrapper[4835]: I1002 11:26:06.260826 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af74a3af-19d5-45bf-b366-5e79fe901079" path="/var/lib/kubelet/pods/af74a3af-19d5-45bf-b366-5e79fe901079/volumes" Oct 02 11:26:10 crc kubenswrapper[4835]: I1002 11:26:10.802352 4835 generic.go:334] "Generic (PLEG): container finished" podID="88def01a-a300-40c2-bd09-e6c4ac838101" containerID="0f643c2ba99f6d78899f4ae4441e5bdc63aa96293f114905e0fa054d3147b5e5" exitCode=2 Oct 02 11:26:10 crc kubenswrapper[4835]: I1002 11:26:10.802397 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" event={"ID":"88def01a-a300-40c2-bd09-e6c4ac838101","Type":"ContainerDied","Data":"0f643c2ba99f6d78899f4ae4441e5bdc63aa96293f114905e0fa054d3147b5e5"} Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.187296 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.306461 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-inventory\") pod \"88def01a-a300-40c2-bd09-e6c4ac838101\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.306528 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjrwh\" (UniqueName: \"kubernetes.io/projected/88def01a-a300-40c2-bd09-e6c4ac838101-kube-api-access-kjrwh\") pod \"88def01a-a300-40c2-bd09-e6c4ac838101\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.306561 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-ssh-key\") pod \"88def01a-a300-40c2-bd09-e6c4ac838101\" (UID: \"88def01a-a300-40c2-bd09-e6c4ac838101\") " Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.312731 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88def01a-a300-40c2-bd09-e6c4ac838101-kube-api-access-kjrwh" (OuterVolumeSpecName: "kube-api-access-kjrwh") pod "88def01a-a300-40c2-bd09-e6c4ac838101" (UID: "88def01a-a300-40c2-bd09-e6c4ac838101"). InnerVolumeSpecName "kube-api-access-kjrwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.339588 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88def01a-a300-40c2-bd09-e6c4ac838101" (UID: "88def01a-a300-40c2-bd09-e6c4ac838101"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.352187 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-inventory" (OuterVolumeSpecName: "inventory") pod "88def01a-a300-40c2-bd09-e6c4ac838101" (UID: "88def01a-a300-40c2-bd09-e6c4ac838101"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.408263 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.408293 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjrwh\" (UniqueName: \"kubernetes.io/projected/88def01a-a300-40c2-bd09-e6c4ac838101-kube-api-access-kjrwh\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.408306 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88def01a-a300-40c2-bd09-e6c4ac838101-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.821432 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" event={"ID":"88def01a-a300-40c2-bd09-e6c4ac838101","Type":"ContainerDied","Data":"3022c57f6273d56c61e5fc5b3c132b5b42e523dfd6b6355b9094ad76515d9ff4"} Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.821470 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3022c57f6273d56c61e5fc5b3c132b5b42e523dfd6b6355b9094ad76515d9ff4" Oct 02 11:26:12 crc kubenswrapper[4835]: I1002 11:26:12.821516 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz" Oct 02 11:26:15 crc kubenswrapper[4835]: I1002 11:26:15.252654 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:26:15 crc kubenswrapper[4835]: E1002 11:26:15.255156 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:26:19 crc kubenswrapper[4835]: I1002 11:26:19.048109 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qhlf"] Oct 02 11:26:19 crc kubenswrapper[4835]: I1002 11:26:19.058729 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qhlf"] Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.032562 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r"] Oct 02 11:26:20 crc kubenswrapper[4835]: E1002 11:26:20.032949 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88def01a-a300-40c2-bd09-e6c4ac838101" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.032962 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88def01a-a300-40c2-bd09-e6c4ac838101" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.033197 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="88def01a-a300-40c2-bd09-e6c4ac838101" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:26:20 crc kubenswrapper[4835]: 
I1002 11:26:20.033903 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.036820 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.040637 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.040640 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.041017 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.066585 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r"] Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.159201 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.159424 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.159471 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8dd\" (UniqueName: \"kubernetes.io/projected/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-kube-api-access-8b8dd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.260775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.261073 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8dd\" (UniqueName: \"kubernetes.io/projected/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-kube-api-access-8b8dd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.261248 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.264541 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2c9e49-6700-488d-bbcd-46812b7bf134" path="/var/lib/kubelet/pods/ff2c9e49-6700-488d-bbcd-46812b7bf134/volumes" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.267269 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.273678 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.282747 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8dd\" (UniqueName: \"kubernetes.io/projected/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-kube-api-access-8b8dd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8j66r\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.368827 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:26:20 crc kubenswrapper[4835]: I1002 11:26:20.925072 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r"] Oct 02 11:26:21 crc kubenswrapper[4835]: I1002 11:26:21.895568 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" event={"ID":"55e8b9d7-69d4-4cc0-8209-5df62efe58a4","Type":"ContainerStarted","Data":"56c23e2bc1ef56e179b3b33e5c086edb500c25c90889b5e24fd64574e01a47f3"} Oct 02 11:26:21 crc kubenswrapper[4835]: I1002 11:26:21.896208 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" event={"ID":"55e8b9d7-69d4-4cc0-8209-5df62efe58a4","Type":"ContainerStarted","Data":"b45028e9f194f94cadab6dd1feee4a897f929f12d453c68db2bbe118c2b438b3"} Oct 02 11:26:29 crc kubenswrapper[4835]: I1002 11:26:29.252309 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:26:29 crc kubenswrapper[4835]: E1002 11:26:29.253402 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:26:42 crc kubenswrapper[4835]: I1002 11:26:42.060157 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" podStartSLOduration=21.379468451 podStartE2EDuration="22.060128208s" podCreationTimestamp="2025-10-02 11:26:20 +0000 UTC" firstStartedPulling="2025-10-02 11:26:20.940444214 +0000 UTC m=+1857.500351795" lastFinishedPulling="2025-10-02 11:26:21.621103931 +0000 UTC m=+1858.181011552" observedRunningTime="2025-10-02 11:26:21.916338822 +0000 UTC m=+1858.476246423" watchObservedRunningTime="2025-10-02 11:26:42.060128208 +0000 UTC m=+1878.620035819" Oct 02 11:26:42 crc kubenswrapper[4835]: I1002 11:26:42.061749 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fcjfv"] Oct 02 11:26:42 crc kubenswrapper[4835]: I1002 11:26:42.072516 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fcjfv"] Oct 02 11:26:42 crc kubenswrapper[4835]: I1002 11:26:42.269042 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28eb8749-6de4-4c76-928c-0f35ce4c378a" path="/var/lib/kubelet/pods/28eb8749-6de4-4c76-928c-0f35ce4c378a/volumes" Oct 02 11:26:43 crc kubenswrapper[4835]: I1002 11:26:43.047654 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mzxn"] Oct 02 11:26:43 crc kubenswrapper[4835]: I1002 11:26:43.055050 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mzxn"] Oct 02 11:26:44 crc kubenswrapper[4835]: I1002 11:26:44.257470 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:26:44 crc kubenswrapper[4835]: E1002 11:26:44.258189 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:26:44 crc kubenswrapper[4835]: I1002 11:26:44.264015 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f1cca3d-b017-4cc3-875e-41a75f8ee14a" path="/var/lib/kubelet/pods/3f1cca3d-b017-4cc3-875e-41a75f8ee14a/volumes" Oct 02 11:26:58 crc kubenswrapper[4835]: I1002 11:26:58.544290 4835 scope.go:117] "RemoveContainer" containerID="f854c0448719ab5b98713d841698bc7c1e7a9721059c23de721d27109c1336d1" Oct 02 11:26:58 crc kubenswrapper[4835]: I1002 11:26:58.585042 4835 scope.go:117] "RemoveContainer" containerID="bf73a1aefa7dcb2c77acb62ad908faee1449eaf34b1a8537c96d8a50b27201fe" Oct 02 11:26:58 crc kubenswrapper[4835]: I1002 11:26:58.641162 4835 scope.go:117] "RemoveContainer" containerID="acc301e91bc70fe750761770f498da7d676fe40a139bbfad72da3abe10357c37" Oct 02 11:26:58 crc kubenswrapper[4835]: I1002 11:26:58.675380 4835 scope.go:117] "RemoveContainer" containerID="c1d1c1bc0be760576a311187ca4f80eeec19da3191c231e3a0a2983e09e1f0ea" Oct 02 11:26:59 crc kubenswrapper[4835]: I1002 11:26:59.252071 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:26:59 crc kubenswrapper[4835]: E1002 11:26:59.252497 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:27:02 crc kubenswrapper[4835]: I1002 11:27:02.246060 4835 generic.go:334] "Generic (PLEG): container finished" podID="55e8b9d7-69d4-4cc0-8209-5df62efe58a4" containerID="56c23e2bc1ef56e179b3b33e5c086edb500c25c90889b5e24fd64574e01a47f3" exitCode=0 Oct 02 11:27:02 crc kubenswrapper[4835]: I1002 11:27:02.246121 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" event={"ID":"55e8b9d7-69d4-4cc0-8209-5df62efe58a4","Type":"ContainerDied","Data":"56c23e2bc1ef56e179b3b33e5c086edb500c25c90889b5e24fd64574e01a47f3"} Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.619236 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.694062 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-ssh-key\") pod \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.694342 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-inventory\") pod \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.694375 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b8dd\" (UniqueName: \"kubernetes.io/projected/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-kube-api-access-8b8dd\") pod \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\" (UID: \"55e8b9d7-69d4-4cc0-8209-5df62efe58a4\") " Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.716853 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-kube-api-access-8b8dd" (OuterVolumeSpecName: "kube-api-access-8b8dd") pod "55e8b9d7-69d4-4cc0-8209-5df62efe58a4" (UID: "55e8b9d7-69d4-4cc0-8209-5df62efe58a4"). InnerVolumeSpecName "kube-api-access-8b8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.721480 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-inventory" (OuterVolumeSpecName: "inventory") pod "55e8b9d7-69d4-4cc0-8209-5df62efe58a4" (UID: "55e8b9d7-69d4-4cc0-8209-5df62efe58a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.738263 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "55e8b9d7-69d4-4cc0-8209-5df62efe58a4" (UID: "55e8b9d7-69d4-4cc0-8209-5df62efe58a4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.796906 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.796950 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:03 crc kubenswrapper[4835]: I1002 11:27:03.796970 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b8dd\" (UniqueName: \"kubernetes.io/projected/55e8b9d7-69d4-4cc0-8209-5df62efe58a4-kube-api-access-8b8dd\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.265071 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" event={"ID":"55e8b9d7-69d4-4cc0-8209-5df62efe58a4","Type":"ContainerDied","Data":"b45028e9f194f94cadab6dd1feee4a897f929f12d453c68db2bbe118c2b438b3"} Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.265151 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.265257 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b45028e9f194f94cadab6dd1feee4a897f929f12d453c68db2bbe118c2b438b3" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.374404 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wjhs2"] Oct 02 11:27:04 crc kubenswrapper[4835]: E1002 11:27:04.374942 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e8b9d7-69d4-4cc0-8209-5df62efe58a4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.374969 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e8b9d7-69d4-4cc0-8209-5df62efe58a4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.375379 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e8b9d7-69d4-4cc0-8209-5df62efe58a4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.376302 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.382605 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.382844 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.382987 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.383926 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wjhs2"] Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.385474 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.407477 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.407601 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szvcg\" (UniqueName: \"kubernetes.io/projected/3acaf2eb-6424-41a4-94d0-51edaa41fb24-kube-api-access-szvcg\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.407690 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.509830 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szvcg\" (UniqueName: \"kubernetes.io/projected/3acaf2eb-6424-41a4-94d0-51edaa41fb24-kube-api-access-szvcg\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.510002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.510048 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc 
kubenswrapper[4835]: I1002 11:27:04.514424 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.515109 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.527299 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szvcg\" (UniqueName: \"kubernetes.io/projected/3acaf2eb-6424-41a4-94d0-51edaa41fb24-kube-api-access-szvcg\") pod \"ssh-known-hosts-edpm-deployment-wjhs2\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:04 crc kubenswrapper[4835]: I1002 11:27:04.696864 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:05 crc kubenswrapper[4835]: I1002 11:27:05.206724 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wjhs2"] Oct 02 11:27:05 crc kubenswrapper[4835]: I1002 11:27:05.274245 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" event={"ID":"3acaf2eb-6424-41a4-94d0-51edaa41fb24","Type":"ContainerStarted","Data":"9800bebb5e5401d7d37e0c39ce2f8499f28aa10487b55b83b680c887f4b4242e"} Oct 02 11:27:06 crc kubenswrapper[4835]: I1002 11:27:06.283862 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" event={"ID":"3acaf2eb-6424-41a4-94d0-51edaa41fb24","Type":"ContainerStarted","Data":"f45aa9349088b7fcf2497145f2faa3534f5edaa711bc059dd5fc7f663f033d63"} Oct 02 11:27:06 crc kubenswrapper[4835]: I1002 11:27:06.311770 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" podStartSLOduration=1.842027662 podStartE2EDuration="2.311747911s" podCreationTimestamp="2025-10-02 11:27:04 +0000 UTC" firstStartedPulling="2025-10-02 11:27:05.215864831 +0000 UTC m=+1901.775772412" lastFinishedPulling="2025-10-02 11:27:05.68558508 +0000 UTC m=+1902.245492661" observedRunningTime="2025-10-02 11:27:06.303392858 +0000 UTC m=+1902.863300459" watchObservedRunningTime="2025-10-02 11:27:06.311747911 +0000 UTC m=+1902.871655492" Oct 02 11:27:12 crc kubenswrapper[4835]: I1002 11:27:12.251541 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:27:12 crc kubenswrapper[4835]: E1002 11:27:12.252409 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:27:12 crc 
kubenswrapper[4835]: I1002 11:27:12.338650 4835 generic.go:334] "Generic (PLEG): container finished" podID="3acaf2eb-6424-41a4-94d0-51edaa41fb24" containerID="f45aa9349088b7fcf2497145f2faa3534f5edaa711bc059dd5fc7f663f033d63" exitCode=0 Oct 02 11:27:12 crc kubenswrapper[4835]: I1002 11:27:12.338704 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" event={"ID":"3acaf2eb-6424-41a4-94d0-51edaa41fb24","Type":"ContainerDied","Data":"f45aa9349088b7fcf2497145f2faa3534f5edaa711bc059dd5fc7f663f033d63"} Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.738687 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.812841 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-inventory-0\") pod \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.813078 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-ssh-key-openstack-edpm-ipam\") pod \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.813166 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szvcg\" (UniqueName: \"kubernetes.io/projected/3acaf2eb-6424-41a4-94d0-51edaa41fb24-kube-api-access-szvcg\") pod \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\" (UID: \"3acaf2eb-6424-41a4-94d0-51edaa41fb24\") " Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.819350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acaf2eb-6424-41a4-94d0-51edaa41fb24-kube-api-access-szvcg" (OuterVolumeSpecName: "kube-api-access-szvcg") pod "3acaf2eb-6424-41a4-94d0-51edaa41fb24" (UID: "3acaf2eb-6424-41a4-94d0-51edaa41fb24"). InnerVolumeSpecName "kube-api-access-szvcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.841110 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3acaf2eb-6424-41a4-94d0-51edaa41fb24" (UID: "3acaf2eb-6424-41a4-94d0-51edaa41fb24"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.846575 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3acaf2eb-6424-41a4-94d0-51edaa41fb24" (UID: "3acaf2eb-6424-41a4-94d0-51edaa41fb24"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.915853 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.915906 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szvcg\" (UniqueName: \"kubernetes.io/projected/3acaf2eb-6424-41a4-94d0-51edaa41fb24-kube-api-access-szvcg\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:13 crc kubenswrapper[4835]: I1002 11:27:13.915916 4835 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3acaf2eb-6424-41a4-94d0-51edaa41fb24-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.354518 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" event={"ID":"3acaf2eb-6424-41a4-94d0-51edaa41fb24","Type":"ContainerDied","Data":"9800bebb5e5401d7d37e0c39ce2f8499f28aa10487b55b83b680c887f4b4242e"} Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.354788 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9800bebb5e5401d7d37e0c39ce2f8499f28aa10487b55b83b680c887f4b4242e" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.354562 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wjhs2" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.442697 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4"] Oct 02 11:27:14 crc kubenswrapper[4835]: E1002 11:27:14.443144 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acaf2eb-6424-41a4-94d0-51edaa41fb24" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.443164 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acaf2eb-6424-41a4-94d0-51edaa41fb24" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.468911 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acaf2eb-6424-41a4-94d0-51edaa41fb24" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.469763 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4"] Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.469871 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.471910 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.473056 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.473070 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.473482 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.528472 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.528647 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.528707 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks8qk\" (UniqueName: \"kubernetes.io/projected/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-kube-api-access-ks8qk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.630434 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.630789 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.630911 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks8qk\" (UniqueName: \"kubernetes.io/projected/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-kube-api-access-ks8qk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.637943 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.638249 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.648452 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks8qk\" (UniqueName: \"kubernetes.io/projected/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-kube-api-access-ks8qk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pvrc4\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:14 crc kubenswrapper[4835]: I1002 11:27:14.796165 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:15 crc kubenswrapper[4835]: I1002 11:27:15.308091 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4"] Oct 02 11:27:15 crc kubenswrapper[4835]: I1002 11:27:15.363594 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" event={"ID":"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce","Type":"ContainerStarted","Data":"765fca1bb0a2bf552d0eaf40a7ae44085748c41d39ab07934a826560c3cee749"} Oct 02 11:27:16 crc kubenswrapper[4835]: I1002 11:27:16.373592 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" event={"ID":"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce","Type":"ContainerStarted","Data":"83469bf6d2c0c4eb8d12decd85fc17959a9e3a98b90939a664ed22c7af41b69d"} Oct 02 11:27:16 crc kubenswrapper[4835]: I1002 11:27:16.393344 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" podStartSLOduration=1.886261095 podStartE2EDuration="2.39332469s" podCreationTimestamp="2025-10-02 11:27:14 +0000 UTC" firstStartedPulling="2025-10-02 11:27:15.314575589 +0000 UTC m=+1911.874483170" lastFinishedPulling="2025-10-02 11:27:15.821639184 +0000 UTC m=+1912.381546765" observedRunningTime="2025-10-02 11:27:16.388601213 +0000 UTC m=+1912.948508804" watchObservedRunningTime="2025-10-02 11:27:16.39332469 +0000 UTC m=+1912.953232471" Oct 02 11:27:24 crc kubenswrapper[4835]: I1002 11:27:24.438910 4835 generic.go:334] "Generic (PLEG): container finished" podID="580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" containerID="83469bf6d2c0c4eb8d12decd85fc17959a9e3a98b90939a664ed22c7af41b69d" exitCode=0 Oct 02 11:27:24 crc kubenswrapper[4835]: I1002 11:27:24.438988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" event={"ID":"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce","Type":"ContainerDied","Data":"83469bf6d2c0c4eb8d12decd85fc17959a9e3a98b90939a664ed22c7af41b69d"} Oct 02 11:27:25 crc kubenswrapper[4835]: I1002 11:27:25.838406 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:25 crc kubenswrapper[4835]: I1002 11:27:25.934507 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks8qk\" (UniqueName: \"kubernetes.io/projected/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-kube-api-access-ks8qk\") pod \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " Oct 02 11:27:25 crc kubenswrapper[4835]: I1002 11:27:25.934559 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-ssh-key\") pod \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " Oct 02 11:27:25 crc kubenswrapper[4835]: I1002 11:27:25.934707 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-inventory\") pod \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\" (UID: \"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce\") " Oct 02 11:27:25 crc kubenswrapper[4835]: I1002 11:27:25.960721 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-kube-api-access-ks8qk" (OuterVolumeSpecName: "kube-api-access-ks8qk") pod "580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" (UID: "580f5f3f-c372-4b1e-8806-8e3ab55cb8ce"). InnerVolumeSpecName "kube-api-access-ks8qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:27:25 crc kubenswrapper[4835]: I1002 11:27:25.998534 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" (UID: "580f5f3f-c372-4b1e-8806-8e3ab55cb8ce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.007758 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-inventory" (OuterVolumeSpecName: "inventory") pod "580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" (UID: "580f5f3f-c372-4b1e-8806-8e3ab55cb8ce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.040671 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks8qk\" (UniqueName: \"kubernetes.io/projected/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-kube-api-access-ks8qk\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.040721 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.040735 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.251666 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:27:26 crc kubenswrapper[4835]: E1002 11:27:26.252055 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.462105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" event={"ID":"580f5f3f-c372-4b1e-8806-8e3ab55cb8ce","Type":"ContainerDied","Data":"765fca1bb0a2bf552d0eaf40a7ae44085748c41d39ab07934a826560c3cee749"} Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.462151 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765fca1bb0a2bf552d0eaf40a7ae44085748c41d39ab07934a826560c3cee749" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.462156 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.513500 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r"] Oct 02 11:27:26 crc kubenswrapper[4835]: E1002 11:27:26.514137 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.514234 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.514513 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.515367 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.517740 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.517927 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.518465 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.519773 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.532020 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r"] Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.651620 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.651962 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.652443 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dsqj\" (UniqueName: \"kubernetes.io/projected/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-kube-api-access-9dsqj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.754371 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dsqj\" (UniqueName: \"kubernetes.io/projected/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-kube-api-access-9dsqj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.754489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.754532 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: 
\"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.758608 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.758661 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.781810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dsqj\" (UniqueName: \"kubernetes.io/projected/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-kube-api-access-9dsqj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:26 crc kubenswrapper[4835]: I1002 11:27:26.831418 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:27 crc kubenswrapper[4835]: I1002 11:27:27.059041 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5lchq"] Oct 02 11:27:27 crc kubenswrapper[4835]: I1002 11:27:27.070349 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5lchq"] Oct 02 11:27:27 crc kubenswrapper[4835]: I1002 11:27:27.399463 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r"] Oct 02 11:27:27 crc kubenswrapper[4835]: I1002 11:27:27.471949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" event={"ID":"86e7d26c-be8d-4b84-90b5-58bdaedbfefc","Type":"ContainerStarted","Data":"b264186d58c0ed00ad06054d6a39196f61d21dac9bfc8210d708460ac5d0dd8d"} Oct 02 11:27:28 crc kubenswrapper[4835]: I1002 11:27:28.264439 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0793397-2746-4cf7-9285-15ae5d86ffd2" path="/var/lib/kubelet/pods/b0793397-2746-4cf7-9285-15ae5d86ffd2/volumes" Oct 02 11:27:28 crc kubenswrapper[4835]: I1002 11:27:28.483278 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" event={"ID":"86e7d26c-be8d-4b84-90b5-58bdaedbfefc","Type":"ContainerStarted","Data":"8764b4549cf14b981b55a4ddc32cb40abb407444fe61fa98133d231ee6bdf7eb"} Oct 02 11:27:28 crc kubenswrapper[4835]: I1002 11:27:28.504350 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" podStartSLOduration=1.860454479 podStartE2EDuration="2.504326365s" podCreationTimestamp="2025-10-02 11:27:26 +0000 UTC" firstStartedPulling="2025-10-02 11:27:27.409055923 +0000 UTC m=+1923.968963504" lastFinishedPulling="2025-10-02 11:27:28.052927809 +0000 UTC m=+1924.612835390" observedRunningTime="2025-10-02 
11:27:28.500843534 +0000 UTC m=+1925.060751115" watchObservedRunningTime="2025-10-02 11:27:28.504326365 +0000 UTC m=+1925.064233976" Oct 02 11:27:37 crc kubenswrapper[4835]: I1002 11:27:37.556442 4835 generic.go:334] "Generic (PLEG): container finished" podID="86e7d26c-be8d-4b84-90b5-58bdaedbfefc" containerID="8764b4549cf14b981b55a4ddc32cb40abb407444fe61fa98133d231ee6bdf7eb" exitCode=0 Oct 02 11:27:37 crc kubenswrapper[4835]: I1002 11:27:37.556523 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" event={"ID":"86e7d26c-be8d-4b84-90b5-58bdaedbfefc","Type":"ContainerDied","Data":"8764b4549cf14b981b55a4ddc32cb40abb407444fe61fa98133d231ee6bdf7eb"} Oct 02 11:27:38 crc kubenswrapper[4835]: I1002 11:27:38.959708 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.074587 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-ssh-key\") pod \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.075625 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dsqj\" (UniqueName: \"kubernetes.io/projected/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-kube-api-access-9dsqj\") pod \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.075756 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-inventory\") pod \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\" (UID: \"86e7d26c-be8d-4b84-90b5-58bdaedbfefc\") " Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.096474 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-kube-api-access-9dsqj" (OuterVolumeSpecName: "kube-api-access-9dsqj") pod "86e7d26c-be8d-4b84-90b5-58bdaedbfefc" (UID: "86e7d26c-be8d-4b84-90b5-58bdaedbfefc"). InnerVolumeSpecName "kube-api-access-9dsqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.163386 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-inventory" (OuterVolumeSpecName: "inventory") pod "86e7d26c-be8d-4b84-90b5-58bdaedbfefc" (UID: "86e7d26c-be8d-4b84-90b5-58bdaedbfefc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.180501 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dsqj\" (UniqueName: \"kubernetes.io/projected/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-kube-api-access-9dsqj\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.180531 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.201284 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "86e7d26c-be8d-4b84-90b5-58bdaedbfefc" (UID: "86e7d26c-be8d-4b84-90b5-58bdaedbfefc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.252160 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:27:39 crc kubenswrapper[4835]: E1002 11:27:39.252752 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.282453 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86e7d26c-be8d-4b84-90b5-58bdaedbfefc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.574419 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" event={"ID":"86e7d26c-be8d-4b84-90b5-58bdaedbfefc","Type":"ContainerDied","Data":"b264186d58c0ed00ad06054d6a39196f61d21dac9bfc8210d708460ac5d0dd8d"} Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.574456 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b264186d58c0ed00ad06054d6a39196f61d21dac9bfc8210d708460ac5d0dd8d" Oct 02 11:27:39 crc kubenswrapper[4835]: I1002 11:27:39.574504 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r" Oct 02 11:27:53 crc kubenswrapper[4835]: I1002 11:27:53.252367 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:27:53 crc kubenswrapper[4835]: E1002 11:27:53.253154 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:27:58 crc kubenswrapper[4835]: I1002 11:27:58.793568 4835 scope.go:117] "RemoveContainer" containerID="84ea7e736a7ae556aab418997888bcb0aca4aa93feeb24a6a1da9ae4c078a246" Oct 02 11:28:08 crc kubenswrapper[4835]: I1002 11:28:08.252411 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:28:08 crc kubenswrapper[4835]: E1002 11:28:08.253250 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:28:23 crc kubenswrapper[4835]: I1002 11:28:23.252288 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:28:23 crc kubenswrapper[4835]: E1002 11:28:23.253094 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:28:38 crc kubenswrapper[4835]: I1002 11:28:38.252028 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:28:38 crc kubenswrapper[4835]: E1002 11:28:38.253047 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:28:50 crc kubenswrapper[4835]: I1002 11:28:50.252606 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:28:51 crc kubenswrapper[4835]: I1002 11:28:51.195969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"73a074b76cb9c97dae601976ee29c575b71ca7b26055a59b51af688465758233"} Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.148523 4835 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6"] Oct 02 11:30:00 crc kubenswrapper[4835]: E1002 11:30:00.149374 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e7d26c-be8d-4b84-90b5-58bdaedbfefc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.149388 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e7d26c-be8d-4b84-90b5-58bdaedbfefc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.149549 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e7d26c-be8d-4b84-90b5-58bdaedbfefc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.150138 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.151985 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.152015 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.173552 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6"] Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.301764 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae9ae1d-0774-4232-9e42-75de03313a30-config-volume\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.301901 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6c9z\" (UniqueName: \"kubernetes.io/projected/1ae9ae1d-0774-4232-9e42-75de03313a30-kube-api-access-g6c9z\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.302012 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae9ae1d-0774-4232-9e42-75de03313a30-secret-volume\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.403384 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae9ae1d-0774-4232-9e42-75de03313a30-config-volume\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.403461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6c9z\" 
(UniqueName: \"kubernetes.io/projected/1ae9ae1d-0774-4232-9e42-75de03313a30-kube-api-access-g6c9z\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.403512 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae9ae1d-0774-4232-9e42-75de03313a30-secret-volume\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.404504 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae9ae1d-0774-4232-9e42-75de03313a30-config-volume\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.410002 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae9ae1d-0774-4232-9e42-75de03313a30-secret-volume\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.420398 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6c9z\" (UniqueName: \"kubernetes.io/projected/1ae9ae1d-0774-4232-9e42-75de03313a30-kube-api-access-g6c9z\") pod \"collect-profiles-29323410-ghnb6\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.472172 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:00 crc kubenswrapper[4835]: I1002 11:30:00.899130 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6"] Oct 02 11:30:00 crc kubenswrapper[4835]: W1002 11:30:00.908387 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ae9ae1d_0774_4232_9e42_75de03313a30.slice/crio-6741633281c6711c13831fb143a92034e003afb3d17cf80bb3290bb488243f6b WatchSource:0}: Error finding container 6741633281c6711c13831fb143a92034e003afb3d17cf80bb3290bb488243f6b: Status 404 returned error can't find the container with id 6741633281c6711c13831fb143a92034e003afb3d17cf80bb3290bb488243f6b Oct 02 11:30:01 crc kubenswrapper[4835]: I1002 11:30:01.779337 4835 generic.go:334] "Generic (PLEG): container finished" podID="1ae9ae1d-0774-4232-9e42-75de03313a30" containerID="c9ef11aedbe4ab190486baa8be722d6c5bfe2abeec4b9933261883dda09d94bb" exitCode=0 Oct 02 11:30:01 crc kubenswrapper[4835]: I1002 11:30:01.779378 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" event={"ID":"1ae9ae1d-0774-4232-9e42-75de03313a30","Type":"ContainerDied","Data":"c9ef11aedbe4ab190486baa8be722d6c5bfe2abeec4b9933261883dda09d94bb"} Oct 02 11:30:01 crc kubenswrapper[4835]: I1002 11:30:01.779402 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" event={"ID":"1ae9ae1d-0774-4232-9e42-75de03313a30","Type":"ContainerStarted","Data":"6741633281c6711c13831fb143a92034e003afb3d17cf80bb3290bb488243f6b"} Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.107539 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.252711 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae9ae1d-0774-4232-9e42-75de03313a30-secret-volume\") pod \"1ae9ae1d-0774-4232-9e42-75de03313a30\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.252792 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae9ae1d-0774-4232-9e42-75de03313a30-config-volume\") pod \"1ae9ae1d-0774-4232-9e42-75de03313a30\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.253072 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6c9z\" (UniqueName: \"kubernetes.io/projected/1ae9ae1d-0774-4232-9e42-75de03313a30-kube-api-access-g6c9z\") pod \"1ae9ae1d-0774-4232-9e42-75de03313a30\" (UID: \"1ae9ae1d-0774-4232-9e42-75de03313a30\") " Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.254004 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae9ae1d-0774-4232-9e42-75de03313a30-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ae9ae1d-0774-4232-9e42-75de03313a30" (UID: "1ae9ae1d-0774-4232-9e42-75de03313a30"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.258907 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae9ae1d-0774-4232-9e42-75de03313a30-kube-api-access-g6c9z" (OuterVolumeSpecName: "kube-api-access-g6c9z") pod "1ae9ae1d-0774-4232-9e42-75de03313a30" (UID: "1ae9ae1d-0774-4232-9e42-75de03313a30"). InnerVolumeSpecName "kube-api-access-g6c9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.258920 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae9ae1d-0774-4232-9e42-75de03313a30-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ae9ae1d-0774-4232-9e42-75de03313a30" (UID: "1ae9ae1d-0774-4232-9e42-75de03313a30"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.355090 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ae9ae1d-0774-4232-9e42-75de03313a30-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.355131 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ae9ae1d-0774-4232-9e42-75de03313a30-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.355141 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6c9z\" (UniqueName: \"kubernetes.io/projected/1ae9ae1d-0774-4232-9e42-75de03313a30-kube-api-access-g6c9z\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.796760 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" event={"ID":"1ae9ae1d-0774-4232-9e42-75de03313a30","Type":"ContainerDied","Data":"6741633281c6711c13831fb143a92034e003afb3d17cf80bb3290bb488243f6b"} Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.796802 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6741633281c6711c13831fb143a92034e003afb3d17cf80bb3290bb488243f6b" Oct 02 11:30:03 crc kubenswrapper[4835]: I1002 11:30:03.796838 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6" Oct 02 11:30:04 crc kubenswrapper[4835]: I1002 11:30:04.186303 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk"] Oct 02 11:30:04 crc kubenswrapper[4835]: I1002 11:30:04.192347 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-x2ggk"] Oct 02 11:30:04 crc kubenswrapper[4835]: I1002 11:30:04.269179 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="380763b9-fdb6-4b62-a8e0-c775708be101" path="/var/lib/kubelet/pods/380763b9-fdb6-4b62-a8e0-c775708be101/volumes" Oct 02 11:30:58 crc kubenswrapper[4835]: I1002 11:30:58.930055 4835 scope.go:117] "RemoveContainer" containerID="c4fbd1bc8691c74e725304953e117bf2a8b494624f56496be446332544b4ffe7" Oct 02 11:31:11 crc kubenswrapper[4835]: I1002 11:31:11.983729 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:31:11 crc kubenswrapper[4835]: I1002 11:31:11.984400 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:31:41 crc kubenswrapper[4835]: I1002 11:31:41.983857 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:31:41 crc kubenswrapper[4835]: I1002 11:31:41.984476 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.833159 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lr5fw"] Oct 02 11:31:45 crc kubenswrapper[4835]: E1002 11:31:45.834283 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae9ae1d-0774-4232-9e42-75de03313a30" containerName="collect-profiles" Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.834303 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae9ae1d-0774-4232-9e42-75de03313a30" containerName="collect-profiles" Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.834536 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae9ae1d-0774-4232-9e42-75de03313a30" containerName="collect-profiles" Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.836131 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.847089 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr5fw"] Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.949965 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-utilities\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.950213 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-catalog-content\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:45 crc kubenswrapper[4835]: I1002 11:31:45.950327 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh6wq\" (UniqueName: \"kubernetes.io/projected/f9f3a288-407a-47bb-966e-ff98080b9123-kube-api-access-kh6wq\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.052643 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-catalog-content\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.052746 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh6wq\" (UniqueName: \"kubernetes.io/projected/f9f3a288-407a-47bb-966e-ff98080b9123-kube-api-access-kh6wq\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.052801 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-utilities\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.053571 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-utilities\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.053572 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-catalog-content\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.075320 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kh6wq\" (UniqueName: \"kubernetes.io/projected/f9f3a288-407a-47bb-966e-ff98080b9123-kube-api-access-kh6wq\") pod \"redhat-marketplace-lr5fw\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.160070 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.604487 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr5fw"] Oct 02 11:31:46 crc kubenswrapper[4835]: I1002 11:31:46.676618 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr5fw" event={"ID":"f9f3a288-407a-47bb-966e-ff98080b9123","Type":"ContainerStarted","Data":"7df4e01e4d8e60aa6a75a69063e32a1706df956207596830150d75189349e23e"} Oct 02 11:31:47 crc kubenswrapper[4835]: I1002 11:31:47.687134 4835 generic.go:334] "Generic (PLEG): container finished" podID="f9f3a288-407a-47bb-966e-ff98080b9123" containerID="15bf4dcf92ecb00604d3a7140bde3fe963a929af47fbdf7727fa8553224814e4" exitCode=0 Oct 02 11:31:47 crc kubenswrapper[4835]: I1002 11:31:47.687199 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr5fw" event={"ID":"f9f3a288-407a-47bb-966e-ff98080b9123","Type":"ContainerDied","Data":"15bf4dcf92ecb00604d3a7140bde3fe963a929af47fbdf7727fa8553224814e4"} Oct 02 11:31:47 crc kubenswrapper[4835]: I1002 11:31:47.688648 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:31:50 crc kubenswrapper[4835]: I1002 11:31:50.723079 4835 generic.go:334] "Generic (PLEG): container finished" podID="f9f3a288-407a-47bb-966e-ff98080b9123" containerID="1d416150be87fdd0a12ed6c32b4282fc7b11efa664b9b78321fb2776a0e4c168" exitCode=0 Oct 02 11:31:50 crc kubenswrapper[4835]: I1002 11:31:50.723205 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr5fw" event={"ID":"f9f3a288-407a-47bb-966e-ff98080b9123","Type":"ContainerDied","Data":"1d416150be87fdd0a12ed6c32b4282fc7b11efa664b9b78321fb2776a0e4c168"} Oct 02 11:31:52 crc kubenswrapper[4835]: I1002 11:31:52.743131 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr5fw" event={"ID":"f9f3a288-407a-47bb-966e-ff98080b9123","Type":"ContainerStarted","Data":"981c232917c5372f5958617b6368e3333049670c052fcb924c6164ef0d43e8e3"} Oct 02 11:31:52 crc kubenswrapper[4835]: I1002 11:31:52.762904 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lr5fw" podStartSLOduration=3.138212285 podStartE2EDuration="7.762887821s" podCreationTimestamp="2025-10-02 11:31:45 +0000 UTC" firstStartedPulling="2025-10-02 11:31:47.688385855 +0000 UTC m=+2184.248293436" lastFinishedPulling="2025-10-02 11:31:52.313061361 +0000 UTC m=+2188.872968972" observedRunningTime="2025-10-02 11:31:52.761597504 +0000 UTC m=+2189.321505105" watchObservedRunningTime="2025-10-02 11:31:52.762887821 +0000 UTC m=+2189.322795402" Oct 02 11:31:56 crc kubenswrapper[4835]: I1002 11:31:56.160738 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:56 crc kubenswrapper[4835]: I1002 11:31:56.162454 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:56 crc kubenswrapper[4835]: I1002 11:31:56.209680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:57 crc kubenswrapper[4835]: I1002 11:31:57.847063 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:31:57 crc kubenswrapper[4835]: I1002 11:31:57.899585 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr5fw"] Oct 02 11:31:59 crc kubenswrapper[4835]: I1002 11:31:59.813379 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lr5fw" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="registry-server" containerID="cri-o://981c232917c5372f5958617b6368e3333049670c052fcb924c6164ef0d43e8e3" gracePeriod=2 Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.822944 4835 generic.go:334] "Generic (PLEG): container finished" podID="f9f3a288-407a-47bb-966e-ff98080b9123" containerID="981c232917c5372f5958617b6368e3333049670c052fcb924c6164ef0d43e8e3" exitCode=0 Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.823016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr5fw" event={"ID":"f9f3a288-407a-47bb-966e-ff98080b9123","Type":"ContainerDied","Data":"981c232917c5372f5958617b6368e3333049670c052fcb924c6164ef0d43e8e3"} Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.823453 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr5fw" event={"ID":"f9f3a288-407a-47bb-966e-ff98080b9123","Type":"ContainerDied","Data":"7df4e01e4d8e60aa6a75a69063e32a1706df956207596830150d75189349e23e"} Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.823497 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7df4e01e4d8e60aa6a75a69063e32a1706df956207596830150d75189349e23e" Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.838823 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.940020 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-utilities\") pod \"f9f3a288-407a-47bb-966e-ff98080b9123\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.940267 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-catalog-content\") pod \"f9f3a288-407a-47bb-966e-ff98080b9123\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.940317 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh6wq\" (UniqueName: \"kubernetes.io/projected/f9f3a288-407a-47bb-966e-ff98080b9123-kube-api-access-kh6wq\") pod \"f9f3a288-407a-47bb-966e-ff98080b9123\" (UID: \"f9f3a288-407a-47bb-966e-ff98080b9123\") " Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.941257 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-utilities" (OuterVolumeSpecName: "utilities") pod "f9f3a288-407a-47bb-966e-ff98080b9123" (UID: "f9f3a288-407a-47bb-966e-ff98080b9123"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.945405 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f3a288-407a-47bb-966e-ff98080b9123-kube-api-access-kh6wq" (OuterVolumeSpecName: "kube-api-access-kh6wq") pod "f9f3a288-407a-47bb-966e-ff98080b9123" (UID: "f9f3a288-407a-47bb-966e-ff98080b9123"). InnerVolumeSpecName "kube-api-access-kh6wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:00 crc kubenswrapper[4835]: I1002 11:32:00.955499 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9f3a288-407a-47bb-966e-ff98080b9123" (UID: "f9f3a288-407a-47bb-966e-ff98080b9123"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:01 crc kubenswrapper[4835]: I1002 11:32:01.044084 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:01 crc kubenswrapper[4835]: I1002 11:32:01.044120 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f3a288-407a-47bb-966e-ff98080b9123-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:01 crc kubenswrapper[4835]: I1002 11:32:01.044132 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh6wq\" (UniqueName: \"kubernetes.io/projected/f9f3a288-407a-47bb-966e-ff98080b9123-kube-api-access-kh6wq\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:01 crc kubenswrapper[4835]: I1002 11:32:01.830120 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr5fw" Oct 02 11:32:01 crc kubenswrapper[4835]: I1002 11:32:01.861923 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr5fw"] Oct 02 11:32:01 crc kubenswrapper[4835]: I1002 11:32:01.869482 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr5fw"] Oct 02 11:32:02 crc kubenswrapper[4835]: I1002 11:32:02.262424 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" path="/var/lib/kubelet/pods/f9f3a288-407a-47bb-966e-ff98080b9123/volumes" Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.943058 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r"] Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.952094 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm"] Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.961306 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-brx4r"] Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.969046 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4"] Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.975338 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw"] Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.981161 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59"] Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.987335 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9"] Oct 02 11:32:06 crc kubenswrapper[4835]: I1002 11:32:06.996645 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.002256 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.008491 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7b6zm"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.014586 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.020714 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.027560 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hb4dw"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.034105 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pvrc4"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.039154 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xj2mt"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 
11:32:07.044027 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bfcz"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.050374 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wjhs2"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.057207 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8j66r"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.063205 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bjfsx"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.068541 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p6r59"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.074164 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wjhs2"] Oct 02 11:32:07 crc kubenswrapper[4835]: I1002 11:32:07.080285 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mfld9"] Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.263637 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be9db5e-b163-4f34-9030-c565b5da60ca" path="/var/lib/kubelet/pods/0be9db5e-b163-4f34-9030-c565b5da60ca/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.264382 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed489f8-6c30-403d-8634-2c67229b4114" path="/var/lib/kubelet/pods/2ed489f8-6c30-403d-8634-2c67229b4114/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.264986 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acaf2eb-6424-41a4-94d0-51edaa41fb24" path="/var/lib/kubelet/pods/3acaf2eb-6424-41a4-94d0-51edaa41fb24/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.265610 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5292edd5-df1a-4ab0-a2cc-0cf5846f60f8" path="/var/lib/kubelet/pods/5292edd5-df1a-4ab0-a2cc-0cf5846f60f8/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.266883 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e8b9d7-69d4-4cc0-8209-5df62efe58a4" path="/var/lib/kubelet/pods/55e8b9d7-69d4-4cc0-8209-5df62efe58a4/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.267515 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580f5f3f-c372-4b1e-8806-8e3ab55cb8ce" path="/var/lib/kubelet/pods/580f5f3f-c372-4b1e-8806-8e3ab55cb8ce/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.268102 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f12acc-2a68-4f9f-bde2-c223c102bf2a" path="/var/lib/kubelet/pods/62f12acc-2a68-4f9f-bde2-c223c102bf2a/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.269142 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71be51be-d812-42b5-9579-e6d50e13eeda" path="/var/lib/kubelet/pods/71be51be-d812-42b5-9579-e6d50e13eeda/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.269767 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cdc9e59-a224-4be9-9908-3629031c7613" path="/var/lib/kubelet/pods/7cdc9e59-a224-4be9-9908-3629031c7613/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 
11:32:08.270377 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86e7d26c-be8d-4b84-90b5-58bdaedbfefc" path="/var/lib/kubelet/pods/86e7d26c-be8d-4b84-90b5-58bdaedbfefc/volumes" Oct 02 11:32:08 crc kubenswrapper[4835]: I1002 11:32:08.271455 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88def01a-a300-40c2-bd09-e6c4ac838101" path="/var/lib/kubelet/pods/88def01a-a300-40c2-bd09-e6c4ac838101/volumes" Oct 02 11:32:11 crc kubenswrapper[4835]: I1002 11:32:11.983820 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:32:11 crc kubenswrapper[4835]: I1002 11:32:11.984193 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:32:11 crc kubenswrapper[4835]: I1002 11:32:11.984298 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:32:11 crc kubenswrapper[4835]: I1002 11:32:11.985078 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73a074b76cb9c97dae601976ee29c575b71ca7b26055a59b51af688465758233"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:32:11 crc kubenswrapper[4835]: I1002 11:32:11.985144 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://73a074b76cb9c97dae601976ee29c575b71ca7b26055a59b51af688465758233" gracePeriod=600 Oct 02 11:32:12 crc kubenswrapper[4835]: I1002 11:32:12.944662 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="73a074b76cb9c97dae601976ee29c575b71ca7b26055a59b51af688465758233" exitCode=0 Oct 02 11:32:12 crc kubenswrapper[4835]: I1002 11:32:12.944973 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"73a074b76cb9c97dae601976ee29c575b71ca7b26055a59b51af688465758233"} Oct 02 11:32:12 crc kubenswrapper[4835]: I1002 11:32:12.945006 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3"} Oct 02 11:32:12 crc kubenswrapper[4835]: I1002 11:32:12.945025 4835 scope.go:117] "RemoveContainer" containerID="59011f223dac7029a698303fa6e938b7b32fe76cf15e6de825fe00063b7c5a5a" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.215029 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p"] Oct 02 11:32:13 crc 
kubenswrapper[4835]: E1002 11:32:13.215736 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="extract-content" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.215752 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="extract-content" Oct 02 11:32:13 crc kubenswrapper[4835]: E1002 11:32:13.215780 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="registry-server" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.215787 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="registry-server" Oct 02 11:32:13 crc kubenswrapper[4835]: E1002 11:32:13.215801 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="extract-utilities" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.215810 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="extract-utilities" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.215987 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f3a288-407a-47bb-966e-ff98080b9123" containerName="registry-server" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.216814 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.223872 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.224213 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.224379 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.224553 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.224899 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.231280 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p"] Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.270705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.270817 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57986\" (UniqueName: \"kubernetes.io/projected/979ea118-1fa5-4dab-838b-8fbd15307fbc-kube-api-access-57986\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.271006 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.271199 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.271491 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.374295 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.374393 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.374488 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.374525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57986\" (UniqueName: \"kubernetes.io/projected/979ea118-1fa5-4dab-838b-8fbd15307fbc-kube-api-access-57986\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.374599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.382165 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.388799 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.388943 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.389828 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.403057 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57986\" (UniqueName: \"kubernetes.io/projected/979ea118-1fa5-4dab-838b-8fbd15307fbc-kube-api-access-57986\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:13 crc kubenswrapper[4835]: I1002 11:32:13.540017 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:14 crc kubenswrapper[4835]: I1002 11:32:14.049085 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p"] Oct 02 11:32:14 crc kubenswrapper[4835]: W1002 11:32:14.062702 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979ea118_1fa5_4dab_838b_8fbd15307fbc.slice/crio-04c7d49efb2f15af8950e96d19c6338c2751bda51485af8895a9e811adb4b68d WatchSource:0}: Error finding container 04c7d49efb2f15af8950e96d19c6338c2751bda51485af8895a9e811adb4b68d: Status 404 returned error can't find the container with id 04c7d49efb2f15af8950e96d19c6338c2751bda51485af8895a9e811adb4b68d Oct 02 11:32:14 crc kubenswrapper[4835]: I1002 11:32:14.968798 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" event={"ID":"979ea118-1fa5-4dab-838b-8fbd15307fbc","Type":"ContainerStarted","Data":"d35426d1625dc9a8c7b687bf9c3c8279fb0a432de4747d63d242615241494ff0"} Oct 02 11:32:14 crc kubenswrapper[4835]: I1002 11:32:14.969419 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" event={"ID":"979ea118-1fa5-4dab-838b-8fbd15307fbc","Type":"ContainerStarted","Data":"04c7d49efb2f15af8950e96d19c6338c2751bda51485af8895a9e811adb4b68d"} Oct 02 11:32:15 crc kubenswrapper[4835]: I1002 11:32:15.004144 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" podStartSLOduration=1.610313562 podStartE2EDuration="2.004119102s" podCreationTimestamp="2025-10-02 11:32:13 +0000 UTC" firstStartedPulling="2025-10-02 11:32:14.065525699 +0000 UTC m=+2210.625433280" lastFinishedPulling="2025-10-02 11:32:14.459331229 +0000 UTC m=+2211.019238820" observedRunningTime="2025-10-02 11:32:14.986364236 +0000 UTC m=+2211.546271827" watchObservedRunningTime="2025-10-02 11:32:15.004119102 +0000 UTC m=+2211.564026683" Oct 02 11:32:27 crc kubenswrapper[4835]: I1002 11:32:27.070070 4835 generic.go:334] "Generic (PLEG): container finished" podID="979ea118-1fa5-4dab-838b-8fbd15307fbc" containerID="d35426d1625dc9a8c7b687bf9c3c8279fb0a432de4747d63d242615241494ff0" exitCode=0 Oct 02 11:32:27 crc kubenswrapper[4835]: I1002 11:32:27.070119 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" event={"ID":"979ea118-1fa5-4dab-838b-8fbd15307fbc","Type":"ContainerDied","Data":"d35426d1625dc9a8c7b687bf9c3c8279fb0a432de4747d63d242615241494ff0"} Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.453053 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.584784 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57986\" (UniqueName: \"kubernetes.io/projected/979ea118-1fa5-4dab-838b-8fbd15307fbc-kube-api-access-57986\") pod \"979ea118-1fa5-4dab-838b-8fbd15307fbc\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.584960 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ssh-key\") pod \"979ea118-1fa5-4dab-838b-8fbd15307fbc\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.584998 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-inventory\") pod \"979ea118-1fa5-4dab-838b-8fbd15307fbc\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.585128 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ceph\") pod \"979ea118-1fa5-4dab-838b-8fbd15307fbc\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.585190 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-repo-setup-combined-ca-bundle\") pod \"979ea118-1fa5-4dab-838b-8fbd15307fbc\" (UID: \"979ea118-1fa5-4dab-838b-8fbd15307fbc\") " Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.598455 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ceph" (OuterVolumeSpecName: "ceph") pod "979ea118-1fa5-4dab-838b-8fbd15307fbc" (UID: "979ea118-1fa5-4dab-838b-8fbd15307fbc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.600393 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "979ea118-1fa5-4dab-838b-8fbd15307fbc" (UID: "979ea118-1fa5-4dab-838b-8fbd15307fbc"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.612408 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979ea118-1fa5-4dab-838b-8fbd15307fbc-kube-api-access-57986" (OuterVolumeSpecName: "kube-api-access-57986") pod "979ea118-1fa5-4dab-838b-8fbd15307fbc" (UID: "979ea118-1fa5-4dab-838b-8fbd15307fbc"). InnerVolumeSpecName "kube-api-access-57986". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.633426 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "979ea118-1fa5-4dab-838b-8fbd15307fbc" (UID: "979ea118-1fa5-4dab-838b-8fbd15307fbc"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.657023 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-inventory" (OuterVolumeSpecName: "inventory") pod "979ea118-1fa5-4dab-838b-8fbd15307fbc" (UID: "979ea118-1fa5-4dab-838b-8fbd15307fbc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.687527 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.687734 4835 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.687825 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57986\" (UniqueName: \"kubernetes.io/projected/979ea118-1fa5-4dab-838b-8fbd15307fbc-kube-api-access-57986\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.687882 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:28 crc kubenswrapper[4835]: I1002 11:32:28.687935 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979ea118-1fa5-4dab-838b-8fbd15307fbc-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.087337 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" event={"ID":"979ea118-1fa5-4dab-838b-8fbd15307fbc","Type":"ContainerDied","Data":"04c7d49efb2f15af8950e96d19c6338c2751bda51485af8895a9e811adb4b68d"} Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.087559 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c7d49efb2f15af8950e96d19c6338c2751bda51485af8895a9e811adb4b68d" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.087393 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.151442 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9"] Oct 02 11:32:29 crc kubenswrapper[4835]: E1002 11:32:29.151854 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979ea118-1fa5-4dab-838b-8fbd15307fbc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.151877 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="979ea118-1fa5-4dab-838b-8fbd15307fbc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.152132 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="979ea118-1fa5-4dab-838b-8fbd15307fbc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.152846 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.155153 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.155366 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.155571 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.155845 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.156861 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.160597 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9"] Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.296687 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.296748 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.296789 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.296817 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnfjv\" (UniqueName: \"kubernetes.io/projected/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-kube-api-access-fnfjv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.296925 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.401239 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.401774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.401824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.401851 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnfjv\" (UniqueName: \"kubernetes.io/projected/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-kube-api-access-fnfjv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.402665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.409847 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" 
Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.410052 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.410371 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.417766 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.419247 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnfjv\" (UniqueName: \"kubernetes.io/projected/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-kube-api-access-fnfjv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.467549 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:32:29 crc kubenswrapper[4835]: I1002 11:32:29.947507 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9"] Oct 02 11:32:29 crc kubenswrapper[4835]: W1002 11:32:29.953150 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9ec4746_21a6_4eb2_bbe4_b929315a91c5.slice/crio-c1d30862b53da7566ad66c63c95b853a1e1317e5208cb6684ae26014a7dfab3f WatchSource:0}: Error finding container c1d30862b53da7566ad66c63c95b853a1e1317e5208cb6684ae26014a7dfab3f: Status 404 returned error can't find the container with id c1d30862b53da7566ad66c63c95b853a1e1317e5208cb6684ae26014a7dfab3f Oct 02 11:32:30 crc kubenswrapper[4835]: I1002 11:32:30.099716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" event={"ID":"a9ec4746-21a6-4eb2-bbe4-b929315a91c5","Type":"ContainerStarted","Data":"c1d30862b53da7566ad66c63c95b853a1e1317e5208cb6684ae26014a7dfab3f"} Oct 02 11:32:31 crc kubenswrapper[4835]: I1002 11:32:31.108125 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" event={"ID":"a9ec4746-21a6-4eb2-bbe4-b929315a91c5","Type":"ContainerStarted","Data":"41aa587174a6f0db79b7587020a49da64af582c313c95c7cdbb81f916caff5ae"} Oct 02 11:32:31 crc kubenswrapper[4835]: I1002 11:32:31.131499 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" podStartSLOduration=1.370033703 podStartE2EDuration="2.131482021s" podCreationTimestamp="2025-10-02 11:32:29 +0000 UTC" firstStartedPulling="2025-10-02 11:32:29.955463699 +0000 UTC m=+2226.515371280" lastFinishedPulling="2025-10-02 11:32:30.716912017 +0000 UTC m=+2227.276819598" observedRunningTime="2025-10-02 11:32:31.123168309 +0000 UTC m=+2227.683075900" watchObservedRunningTime="2025-10-02 11:32:31.131482021 +0000 UTC m=+2227.691389602" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.013521 4835 scope.go:117] "RemoveContainer" containerID="0f643c2ba99f6d78899f4ae4441e5bdc63aa96293f114905e0fa054d3147b5e5" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.069791 4835 scope.go:117] "RemoveContainer" containerID="405caea66cbf92d42f0762d9c583403142c761591f239825d7ba292c57ca7edf" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.124442 4835 scope.go:117] "RemoveContainer" containerID="56c23e2bc1ef56e179b3b33e5c086edb500c25c90889b5e24fd64574e01a47f3" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.165597 4835 scope.go:117] "RemoveContainer" containerID="5de9a334b11cbf9d8419906fd6d893e7e8ef0f220847f058c2dce429ac45bb89" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.223928 4835 scope.go:117] "RemoveContainer" containerID="1eb230f23a7ec5033317bf72a75b47e609265a7a840e6ec77ad4f14f41b05d03" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.247282 4835 scope.go:117] "RemoveContainer" containerID="275aad0297460f7c2885f2d92f56717591343c14a05d5ccc4fa6d32bfa99c4ee" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.290150 4835 scope.go:117] "RemoveContainer" containerID="6f492035d9d9991cbe6d8eca9baaedb65aba3144732de85280f28a3da60d82da" Oct 02 11:32:59 crc kubenswrapper[4835]: I1002 11:32:59.319790 4835 scope.go:117] "RemoveContainer" 
containerID="acac1ce3832e8a4ea0cff66ecd507902e9ecd7896984dfd99b4dc495c33fa7de" Oct 02 11:33:59 crc kubenswrapper[4835]: I1002 11:33:59.491617 4835 scope.go:117] "RemoveContainer" containerID="f45aa9349088b7fcf2497145f2faa3534f5edaa711bc059dd5fc7f663f033d63" Oct 02 11:33:59 crc kubenswrapper[4835]: I1002 11:33:59.524063 4835 scope.go:117] "RemoveContainer" containerID="83469bf6d2c0c4eb8d12decd85fc17959a9e3a98b90939a664ed22c7af41b69d" Oct 02 11:33:59 crc kubenswrapper[4835]: I1002 11:33:59.564374 4835 scope.go:117] "RemoveContainer" containerID="8764b4549cf14b981b55a4ddc32cb40abb407444fe61fa98133d231ee6bdf7eb" Oct 02 11:34:03 crc kubenswrapper[4835]: I1002 11:34:03.855572 4835 generic.go:334] "Generic (PLEG): container finished" podID="a9ec4746-21a6-4eb2-bbe4-b929315a91c5" containerID="41aa587174a6f0db79b7587020a49da64af582c313c95c7cdbb81f916caff5ae" exitCode=0 Oct 02 11:34:03 crc kubenswrapper[4835]: I1002 11:34:03.856085 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" event={"ID":"a9ec4746-21a6-4eb2-bbe4-b929315a91c5","Type":"ContainerDied","Data":"41aa587174a6f0db79b7587020a49da64af582c313c95c7cdbb81f916caff5ae"} Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.247420 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.414261 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-inventory\") pod \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.414335 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key\") pod \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.414497 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ceph\") pod \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.414577 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-bootstrap-combined-ca-bundle\") pod \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.414670 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnfjv\" (UniqueName: \"kubernetes.io/projected/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-kube-api-access-fnfjv\") pod \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.420087 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a9ec4746-21a6-4eb2-bbe4-b929315a91c5" (UID: "a9ec4746-21a6-4eb2-bbe4-b929315a91c5"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.420111 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ceph" (OuterVolumeSpecName: "ceph") pod "a9ec4746-21a6-4eb2-bbe4-b929315a91c5" (UID: "a9ec4746-21a6-4eb2-bbe4-b929315a91c5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.420988 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-kube-api-access-fnfjv" (OuterVolumeSpecName: "kube-api-access-fnfjv") pod "a9ec4746-21a6-4eb2-bbe4-b929315a91c5" (UID: "a9ec4746-21a6-4eb2-bbe4-b929315a91c5"). InnerVolumeSpecName "kube-api-access-fnfjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:05 crc kubenswrapper[4835]: E1002 11:34:05.439511 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key podName:a9ec4746-21a6-4eb2-bbe4-b929315a91c5 nodeName:}" failed. No retries permitted until 2025-10-02 11:34:05.939484219 +0000 UTC m=+2322.499391800 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key") pod "a9ec4746-21a6-4eb2-bbe4-b929315a91c5" (UID: "a9ec4746-21a6-4eb2-bbe4-b929315a91c5") : error deleting /var/lib/kubelet/pods/a9ec4746-21a6-4eb2-bbe4-b929315a91c5/volume-subpaths: remove /var/lib/kubelet/pods/a9ec4746-21a6-4eb2-bbe4-b929315a91c5/volume-subpaths: no such file or directory Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.441871 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-inventory" (OuterVolumeSpecName: "inventory") pod "a9ec4746-21a6-4eb2-bbe4-b929315a91c5" (UID: "a9ec4746-21a6-4eb2-bbe4-b929315a91c5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.516373 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnfjv\" (UniqueName: \"kubernetes.io/projected/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-kube-api-access-fnfjv\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.516402 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.516411 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.516420 4835 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.873190 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" event={"ID":"a9ec4746-21a6-4eb2-bbe4-b929315a91c5","Type":"ContainerDied","Data":"c1d30862b53da7566ad66c63c95b853a1e1317e5208cb6684ae26014a7dfab3f"} Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.873245 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.873257 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d30862b53da7566ad66c63c95b853a1e1317e5208cb6684ae26014a7dfab3f" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.954685 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf"] Oct 02 11:34:05 crc kubenswrapper[4835]: E1002 11:34:05.955085 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ec4746-21a6-4eb2-bbe4-b929315a91c5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.955112 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ec4746-21a6-4eb2-bbe4-b929315a91c5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.955416 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ec4746-21a6-4eb2-bbe4-b929315a91c5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.956126 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:05 crc kubenswrapper[4835]: I1002 11:34:05.964799 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf"] Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.023140 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key\") pod \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\" (UID: \"a9ec4746-21a6-4eb2-bbe4-b929315a91c5\") " Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.034482 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a9ec4746-21a6-4eb2-bbe4-b929315a91c5" (UID: "a9ec4746-21a6-4eb2-bbe4-b929315a91c5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.125410 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.125470 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.125535 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.125612 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4wf\" (UniqueName: \"kubernetes.io/projected/d830ff32-e5f8-46b0-ba9c-988561d11e8c-kube-api-access-ws4wf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.125669 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9ec4746-21a6-4eb2-bbe4-b929315a91c5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.226658 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4wf\" (UniqueName: \"kubernetes.io/projected/d830ff32-e5f8-46b0-ba9c-988561d11e8c-kube-api-access-ws4wf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.226727 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.226779 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.226865 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.230510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.232886 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.232964 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.243767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4wf\" (UniqueName: \"kubernetes.io/projected/d830ff32-e5f8-46b0-ba9c-988561d11e8c-kube-api-access-ws4wf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-844jf\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.278767 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.795450 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf"] Oct 02 11:34:06 crc kubenswrapper[4835]: I1002 11:34:06.881433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" event={"ID":"d830ff32-e5f8-46b0-ba9c-988561d11e8c","Type":"ContainerStarted","Data":"e321bab70f84e0fa73fb531a54b521f9e2b7f2a03a7193e3cb92b01ecd9d8ee6"} Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.868463 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qxmqm"] Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.870734 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.879709 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxmqm"] Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.894301 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" event={"ID":"d830ff32-e5f8-46b0-ba9c-988561d11e8c","Type":"ContainerStarted","Data":"e20b3b5c1626ab23cc0e9b90407eebb7100d8818030555c284f5defc51a7b6ba"} Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.916092 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" podStartSLOduration=2.409234816 podStartE2EDuration="2.916073177s" podCreationTimestamp="2025-10-02 11:34:05 +0000 UTC" firstStartedPulling="2025-10-02 11:34:06.800856163 +0000 UTC m=+2323.360763744" lastFinishedPulling="2025-10-02 11:34:07.307694534 +0000 UTC m=+2323.867602105" observedRunningTime="2025-10-02 11:34:07.910358903 +0000 UTC m=+2324.470266484" watchObservedRunningTime="2025-10-02 11:34:07.916073177 +0000 UTC m=+2324.475980758" Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.959705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-catalog-content\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.959787 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85j9b\" (UniqueName: \"kubernetes.io/projected/b91ac39d-201b-496c-bda2-cd1d917d63a8-kube-api-access-85j9b\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:07 crc kubenswrapper[4835]: I1002 11:34:07.959884 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-utilities\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.061123 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-catalog-content\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.061465 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85j9b\" (UniqueName: \"kubernetes.io/projected/b91ac39d-201b-496c-bda2-cd1d917d63a8-kube-api-access-85j9b\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.061687 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-utilities\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.061816 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-catalog-content\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.062331 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-utilities\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.082021 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85j9b\" (UniqueName: \"kubernetes.io/projected/b91ac39d-201b-496c-bda2-cd1d917d63a8-kube-api-access-85j9b\") pod \"community-operators-qxmqm\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.188042 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.700606 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxmqm"] Oct 02 11:34:08 crc kubenswrapper[4835]: W1002 11:34:08.708343 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb91ac39d_201b_496c_bda2_cd1d917d63a8.slice/crio-2371ed7f4b7147bbd67b01e93583c29af153a67c4e89b29ea9e01ffc6a8a7907 WatchSource:0}: Error finding container 2371ed7f4b7147bbd67b01e93583c29af153a67c4e89b29ea9e01ffc6a8a7907: Status 404 returned error can't find the container with id 2371ed7f4b7147bbd67b01e93583c29af153a67c4e89b29ea9e01ffc6a8a7907 Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.903736 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerStarted","Data":"96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01"} Oct 02 11:34:08 crc kubenswrapper[4835]: I1002 11:34:08.904085 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerStarted","Data":"2371ed7f4b7147bbd67b01e93583c29af153a67c4e89b29ea9e01ffc6a8a7907"} Oct 02 11:34:09 crc kubenswrapper[4835]: I1002 11:34:09.912494 4835 generic.go:334] "Generic (PLEG): container finished" podID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerID="96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01" exitCode=0 Oct 02 11:34:09 crc kubenswrapper[4835]: I1002 11:34:09.912753 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerDied","Data":"96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01"} Oct 02 11:34:10 crc kubenswrapper[4835]: I1002 11:34:10.921884 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerStarted","Data":"fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c"} Oct 02 11:34:11 crc kubenswrapper[4835]: I1002 11:34:11.932407 4835 generic.go:334] "Generic (PLEG): container finished" podID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerID="fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c" exitCode=0 Oct 02 11:34:11 crc kubenswrapper[4835]: I1002 11:34:11.932456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerDied","Data":"fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c"} Oct 02 11:34:13 crc kubenswrapper[4835]: I1002 11:34:13.949928 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerStarted","Data":"83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84"} Oct 02 11:34:13 crc kubenswrapper[4835]: I1002 11:34:13.979405 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qxmqm" podStartSLOduration=3.769958067 podStartE2EDuration="6.979384369s" podCreationTimestamp="2025-10-02 11:34:07 +0000 UTC" 
firstStartedPulling="2025-10-02 11:34:09.914810486 +0000 UTC m=+2326.474718067" lastFinishedPulling="2025-10-02 11:34:13.124236788 +0000 UTC m=+2329.684144369" observedRunningTime="2025-10-02 11:34:13.969004751 +0000 UTC m=+2330.528912342" watchObservedRunningTime="2025-10-02 11:34:13.979384369 +0000 UTC m=+2330.539291950" Oct 02 11:34:18 crc kubenswrapper[4835]: I1002 11:34:18.188984 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:18 crc kubenswrapper[4835]: I1002 11:34:18.189315 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:18 crc kubenswrapper[4835]: I1002 11:34:18.236614 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:19 crc kubenswrapper[4835]: I1002 11:34:19.056335 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:19 crc kubenswrapper[4835]: I1002 11:34:19.117932 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qxmqm"] Oct 02 11:34:20 crc kubenswrapper[4835]: I1002 11:34:20.999551 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qxmqm" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="registry-server" containerID="cri-o://83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84" gracePeriod=2 Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.426300 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.511648 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-catalog-content\") pod \"b91ac39d-201b-496c-bda2-cd1d917d63a8\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.511694 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-utilities\") pod \"b91ac39d-201b-496c-bda2-cd1d917d63a8\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.511766 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85j9b\" (UniqueName: \"kubernetes.io/projected/b91ac39d-201b-496c-bda2-cd1d917d63a8-kube-api-access-85j9b\") pod \"b91ac39d-201b-496c-bda2-cd1d917d63a8\" (UID: \"b91ac39d-201b-496c-bda2-cd1d917d63a8\") " Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.512501 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-utilities" (OuterVolumeSpecName: "utilities") pod "b91ac39d-201b-496c-bda2-cd1d917d63a8" (UID: "b91ac39d-201b-496c-bda2-cd1d917d63a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.517087 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91ac39d-201b-496c-bda2-cd1d917d63a8-kube-api-access-85j9b" (OuterVolumeSpecName: "kube-api-access-85j9b") pod "b91ac39d-201b-496c-bda2-cd1d917d63a8" (UID: "b91ac39d-201b-496c-bda2-cd1d917d63a8"). InnerVolumeSpecName "kube-api-access-85j9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.614095 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85j9b\" (UniqueName: \"kubernetes.io/projected/b91ac39d-201b-496c-bda2-cd1d917d63a8-kube-api-access-85j9b\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.614132 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.663612 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b91ac39d-201b-496c-bda2-cd1d917d63a8" (UID: "b91ac39d-201b-496c-bda2-cd1d917d63a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:21 crc kubenswrapper[4835]: I1002 11:34:21.716270 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b91ac39d-201b-496c-bda2-cd1d917d63a8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.009528 4835 generic.go:334] "Generic (PLEG): container finished" podID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerID="83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84" exitCode=0 Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.009583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerDied","Data":"83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84"} Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.009598 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxmqm" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.009660 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxmqm" event={"ID":"b91ac39d-201b-496c-bda2-cd1d917d63a8","Type":"ContainerDied","Data":"2371ed7f4b7147bbd67b01e93583c29af153a67c4e89b29ea9e01ffc6a8a7907"} Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.009709 4835 scope.go:117] "RemoveContainer" containerID="83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.041771 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qxmqm"] Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.041975 4835 scope.go:117] "RemoveContainer" containerID="fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.048532 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qxmqm"] Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.071628 4835 scope.go:117] "RemoveContainer" containerID="96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.095689 4835 scope.go:117] "RemoveContainer" containerID="83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84" Oct 02 11:34:22 crc kubenswrapper[4835]: E1002 11:34:22.096119 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84\": container with ID starting with 83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84 not found: ID does not exist" containerID="83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.096178 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84"} err="failed to get container status \"83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84\": rpc error: code = NotFound desc = could not find container \"83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84\": container with ID starting with 83322805c926455d13b28b0f88df5129269ba72eeea98cef1d305b371795cd84 not found: ID does not exist" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.096199 4835 scope.go:117] "RemoveContainer" containerID="fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c" Oct 02 11:34:22 crc kubenswrapper[4835]: E1002 11:34:22.096422 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c\": container with ID starting with fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c not found: ID does not exist" containerID="fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.096459 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c"} err="failed to get container status \"fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c\": rpc error: code = NotFound desc = could not find 
container \"fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c\": container with ID starting with fa6a6b1c673d2e41c2434bf81531818b0e3c0849712fab5c18b56fffab940a6c not found: ID does not exist" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.096472 4835 scope.go:117] "RemoveContainer" containerID="96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01" Oct 02 11:34:22 crc kubenswrapper[4835]: E1002 11:34:22.096620 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01\": container with ID starting with 96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01 not found: ID does not exist" containerID="96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.096639 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01"} err="failed to get container status \"96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01\": rpc error: code = NotFound desc = could not find container \"96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01\": container with ID starting with 96374a5d158d46c9b0b1ce07c6f221def9757fb55355a250b6e49122b7355f01 not found: ID does not exist" Oct 02 11:34:22 crc kubenswrapper[4835]: I1002 11:34:22.262685 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" path="/var/lib/kubelet/pods/b91ac39d-201b-496c-bda2-cd1d917d63a8/volumes" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.619145 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-27mn6"] Oct 02 11:34:29 crc kubenswrapper[4835]: E1002 11:34:29.621015 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="extract-utilities" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.621125 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="extract-utilities" Oct 02 11:34:29 crc kubenswrapper[4835]: E1002 11:34:29.621219 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="extract-content" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.621353 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="extract-content" Oct 02 11:34:29 crc kubenswrapper[4835]: E1002 11:34:29.621421 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="registry-server" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.621471 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="registry-server" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.621679 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91ac39d-201b-496c-bda2-cd1d917d63a8" containerName="registry-server" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.623184 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.633863 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-27mn6"] Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.766144 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tl8\" (UniqueName: \"kubernetes.io/projected/a5a1543b-05c0-41de-b492-c4a266793d2b-kube-api-access-f7tl8\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.766299 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-utilities\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.766323 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-catalog-content\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.867730 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tl8\" (UniqueName: \"kubernetes.io/projected/a5a1543b-05c0-41de-b492-c4a266793d2b-kube-api-access-f7tl8\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.867930 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-utilities\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.867963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-catalog-content\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.868481 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-utilities\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.868566 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-catalog-content\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.893451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f7tl8\" (UniqueName: \"kubernetes.io/projected/a5a1543b-05c0-41de-b492-c4a266793d2b-kube-api-access-f7tl8\") pod \"redhat-operators-27mn6\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:29 crc kubenswrapper[4835]: I1002 11:34:29.946279 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:30 crc kubenswrapper[4835]: I1002 11:34:30.431110 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-27mn6"] Oct 02 11:34:31 crc kubenswrapper[4835]: I1002 11:34:31.109744 4835 generic.go:334] "Generic (PLEG): container finished" podID="d830ff32-e5f8-46b0-ba9c-988561d11e8c" containerID="e20b3b5c1626ab23cc0e9b90407eebb7100d8818030555c284f5defc51a7b6ba" exitCode=0 Oct 02 11:34:31 crc kubenswrapper[4835]: I1002 11:34:31.109843 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" event={"ID":"d830ff32-e5f8-46b0-ba9c-988561d11e8c","Type":"ContainerDied","Data":"e20b3b5c1626ab23cc0e9b90407eebb7100d8818030555c284f5defc51a7b6ba"} Oct 02 11:34:31 crc kubenswrapper[4835]: I1002 11:34:31.113498 4835 generic.go:334] "Generic (PLEG): container finished" podID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerID="deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05" exitCode=0 Oct 02 11:34:31 crc kubenswrapper[4835]: I1002 11:34:31.113560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27mn6" event={"ID":"a5a1543b-05c0-41de-b492-c4a266793d2b","Type":"ContainerDied","Data":"deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05"} Oct 02 11:34:31 crc kubenswrapper[4835]: I1002 11:34:31.113617 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27mn6" event={"ID":"a5a1543b-05c0-41de-b492-c4a266793d2b","Type":"ContainerStarted","Data":"e0edbb1aed2ad152f2d170f610f2bd16a3b78d2c464e2759995072976b1c1b3b"} Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.532167 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.629812 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws4wf\" (UniqueName: \"kubernetes.io/projected/d830ff32-e5f8-46b0-ba9c-988561d11e8c-kube-api-access-ws4wf\") pod \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.629911 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ssh-key\") pod \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.629971 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-inventory\") pod \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.629991 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ceph\") pod \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\" (UID: \"d830ff32-e5f8-46b0-ba9c-988561d11e8c\") " Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.635183 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d830ff32-e5f8-46b0-ba9c-988561d11e8c-kube-api-access-ws4wf" (OuterVolumeSpecName: "kube-api-access-ws4wf") pod "d830ff32-e5f8-46b0-ba9c-988561d11e8c" (UID: "d830ff32-e5f8-46b0-ba9c-988561d11e8c"). InnerVolumeSpecName "kube-api-access-ws4wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.635551 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ceph" (OuterVolumeSpecName: "ceph") pod "d830ff32-e5f8-46b0-ba9c-988561d11e8c" (UID: "d830ff32-e5f8-46b0-ba9c-988561d11e8c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.660983 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d830ff32-e5f8-46b0-ba9c-988561d11e8c" (UID: "d830ff32-e5f8-46b0-ba9c-988561d11e8c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.662398 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-inventory" (OuterVolumeSpecName: "inventory") pod "d830ff32-e5f8-46b0-ba9c-988561d11e8c" (UID: "d830ff32-e5f8-46b0-ba9c-988561d11e8c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.732509 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws4wf\" (UniqueName: \"kubernetes.io/projected/d830ff32-e5f8-46b0-ba9c-988561d11e8c-kube-api-access-ws4wf\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.732558 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.732569 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4835]: I1002 11:34:32.732581 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d830ff32-e5f8-46b0-ba9c-988561d11e8c-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.130695 4835 generic.go:334] "Generic (PLEG): container finished" podID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerID="3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2" exitCode=0 Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.130770 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27mn6" event={"ID":"a5a1543b-05c0-41de-b492-c4a266793d2b","Type":"ContainerDied","Data":"3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2"} Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.132656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" event={"ID":"d830ff32-e5f8-46b0-ba9c-988561d11e8c","Type":"ContainerDied","Data":"e321bab70f84e0fa73fb531a54b521f9e2b7f2a03a7193e3cb92b01ecd9d8ee6"} Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.132683 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e321bab70f84e0fa73fb531a54b521f9e2b7f2a03a7193e3cb92b01ecd9d8ee6" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.132716 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-844jf" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.201404 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh"] Oct 02 11:34:33 crc kubenswrapper[4835]: E1002 11:34:33.201933 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d830ff32-e5f8-46b0-ba9c-988561d11e8c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.201963 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d830ff32-e5f8-46b0-ba9c-988561d11e8c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.202244 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d830ff32-e5f8-46b0-ba9c-988561d11e8c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.203151 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.206426 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.206760 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.206842 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.206926 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.207095 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.214805 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh"] Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.344977 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.345044 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.345167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7xr2\" (UniqueName: \"kubernetes.io/projected/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-kube-api-access-q7xr2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.345274 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.447575 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.447653 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.447731 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7xr2\" (UniqueName: \"kubernetes.io/projected/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-kube-api-access-q7xr2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.447763 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.452316 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.452664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.453519 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.469105 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7xr2\" (UniqueName: \"kubernetes.io/projected/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-kube-api-access-q7xr2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:33 crc kubenswrapper[4835]: I1002 11:34:33.526552 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:34 crc kubenswrapper[4835]: I1002 11:34:34.026778 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh"] Oct 02 11:34:34 crc kubenswrapper[4835]: W1002 11:34:34.027585 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a1bc51_7cd2_45b7_bb63_0f7af1f913a8.slice/crio-eca14861949d897c7acc40d39f246a6145ef95273ce994275bf86d55e1b5688e WatchSource:0}: Error finding container eca14861949d897c7acc40d39f246a6145ef95273ce994275bf86d55e1b5688e: Status 404 returned error can't find the container with id eca14861949d897c7acc40d39f246a6145ef95273ce994275bf86d55e1b5688e Oct 02 11:34:34 crc kubenswrapper[4835]: I1002 11:34:34.142481 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27mn6" event={"ID":"a5a1543b-05c0-41de-b492-c4a266793d2b","Type":"ContainerStarted","Data":"809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c"} Oct 02 11:34:34 crc kubenswrapper[4835]: I1002 11:34:34.145038 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" event={"ID":"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8","Type":"ContainerStarted","Data":"eca14861949d897c7acc40d39f246a6145ef95273ce994275bf86d55e1b5688e"} Oct 02 11:34:34 crc kubenswrapper[4835]: I1002 11:34:34.166141 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-27mn6" podStartSLOduration=2.5094569939999998 podStartE2EDuration="5.166123877s" podCreationTimestamp="2025-10-02 11:34:29 +0000 UTC" firstStartedPulling="2025-10-02 11:34:31.11789466 +0000 UTC m=+2347.677802261" lastFinishedPulling="2025-10-02 11:34:33.774561563 +0000 UTC m=+2350.334469144" observedRunningTime="2025-10-02 11:34:34.157767617 +0000 UTC m=+2350.717675208" watchObservedRunningTime="2025-10-02 11:34:34.166123877 +0000 UTC m=+2350.726031458" Oct 02 11:34:35 crc kubenswrapper[4835]: I1002 11:34:35.153949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" event={"ID":"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8","Type":"ContainerStarted","Data":"6c615acd6e70d8e26d787d3cae4b8dd27be001dee3e729a3cdc4b4436749c8c7"} Oct 02 11:34:35 crc kubenswrapper[4835]: I1002 11:34:35.179746 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" podStartSLOduration=1.760840697 podStartE2EDuration="2.179732164s" podCreationTimestamp="2025-10-02 11:34:33 +0000 UTC" firstStartedPulling="2025-10-02 11:34:34.030248229 +0000 UTC m=+2350.590155810" lastFinishedPulling="2025-10-02 11:34:34.449139706 +0000 UTC m=+2351.009047277" observedRunningTime="2025-10-02 11:34:35.17540668 +0000 UTC m=+2351.735314271" watchObservedRunningTime="2025-10-02 11:34:35.179732164 +0000 UTC m=+2351.739639745" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.011135 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-szqkj"] Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.014469 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.020190 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szqkj"] Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.113533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-utilities\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.113691 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hllh\" (UniqueName: \"kubernetes.io/projected/028f73f2-8c0b-42c8-ab03-9778518a1ae7-kube-api-access-9hllh\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.113740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-catalog-content\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.214814 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hllh\" (UniqueName: \"kubernetes.io/projected/028f73f2-8c0b-42c8-ab03-9778518a1ae7-kube-api-access-9hllh\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.214887 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-catalog-content\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.214951 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-utilities\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.215464 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-utilities\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.215908 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-catalog-content\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.243079 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9hllh\" (UniqueName: \"kubernetes.io/projected/028f73f2-8c0b-42c8-ab03-9778518a1ae7-kube-api-access-9hllh\") pod \"certified-operators-szqkj\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.340697 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:36 crc kubenswrapper[4835]: I1002 11:34:36.850155 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szqkj"] Oct 02 11:34:37 crc kubenswrapper[4835]: I1002 11:34:37.170865 4835 generic.go:334] "Generic (PLEG): container finished" podID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerID="356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad" exitCode=0 Oct 02 11:34:37 crc kubenswrapper[4835]: I1002 11:34:37.171093 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szqkj" event={"ID":"028f73f2-8c0b-42c8-ab03-9778518a1ae7","Type":"ContainerDied","Data":"356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad"} Oct 02 11:34:37 crc kubenswrapper[4835]: I1002 11:34:37.171119 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szqkj" event={"ID":"028f73f2-8c0b-42c8-ab03-9778518a1ae7","Type":"ContainerStarted","Data":"ec86264d8f4d664644aaa209966df3bebb5be46a2396910b3d99e2836c7d0503"} Oct 02 11:34:38 crc kubenswrapper[4835]: I1002 11:34:38.180188 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szqkj" event={"ID":"028f73f2-8c0b-42c8-ab03-9778518a1ae7","Type":"ContainerStarted","Data":"53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272"} Oct 02 11:34:39 crc kubenswrapper[4835]: I1002 11:34:39.192484 4835 generic.go:334] "Generic (PLEG): container finished" podID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerID="53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272" exitCode=0 Oct 02 11:34:39 crc kubenswrapper[4835]: I1002 11:34:39.193184 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szqkj" event={"ID":"028f73f2-8c0b-42c8-ab03-9778518a1ae7","Type":"ContainerDied","Data":"53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272"} Oct 02 11:34:39 crc kubenswrapper[4835]: I1002 11:34:39.198139 4835 generic.go:334] "Generic (PLEG): container finished" podID="d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" containerID="6c615acd6e70d8e26d787d3cae4b8dd27be001dee3e729a3cdc4b4436749c8c7" exitCode=0 Oct 02 11:34:39 crc kubenswrapper[4835]: I1002 11:34:39.198179 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" event={"ID":"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8","Type":"ContainerDied","Data":"6c615acd6e70d8e26d787d3cae4b8dd27be001dee3e729a3cdc4b4436749c8c7"} Oct 02 11:34:39 crc kubenswrapper[4835]: I1002 11:34:39.947689 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:39 crc kubenswrapper[4835]: I1002 11:34:39.948447 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:39 crc kubenswrapper[4835]: I1002 11:34:39.995894 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.206330 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szqkj" event={"ID":"028f73f2-8c0b-42c8-ab03-9778518a1ae7","Type":"ContainerStarted","Data":"ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b"} Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.236324 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-szqkj" podStartSLOduration=2.660535863 podStartE2EDuration="5.236307756s" podCreationTimestamp="2025-10-02 11:34:35 +0000 UTC" firstStartedPulling="2025-10-02 11:34:37.172946905 +0000 UTC m=+2353.732854486" lastFinishedPulling="2025-10-02 11:34:39.748718798 +0000 UTC m=+2356.308626379" observedRunningTime="2025-10-02 11:34:40.22773252 +0000 UTC m=+2356.787640111" watchObservedRunningTime="2025-10-02 11:34:40.236307756 +0000 UTC m=+2356.796215337" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.262397 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.619187 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.792536 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-inventory\") pod \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.792589 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7xr2\" (UniqueName: \"kubernetes.io/projected/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-kube-api-access-q7xr2\") pod \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.792710 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ssh-key\") pod \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.792733 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ceph\") pod \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\" (UID: \"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8\") " Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.805589 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ceph" (OuterVolumeSpecName: "ceph") pod "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" (UID: "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.805693 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-kube-api-access-q7xr2" (OuterVolumeSpecName: "kube-api-access-q7xr2") pod "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" (UID: "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8"). InnerVolumeSpecName "kube-api-access-q7xr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.818844 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-inventory" (OuterVolumeSpecName: "inventory") pod "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" (UID: "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.823864 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" (UID: "d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.894633 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.894839 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.894897 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:40 crc kubenswrapper[4835]: I1002 11:34:40.894952 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7xr2\" (UniqueName: \"kubernetes.io/projected/d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8-kube-api-access-q7xr2\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.214593 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" event={"ID":"d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8","Type":"ContainerDied","Data":"eca14861949d897c7acc40d39f246a6145ef95273ce994275bf86d55e1b5688e"} Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.215652 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca14861949d897c7acc40d39f246a6145ef95273ce994275bf86d55e1b5688e" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.214762 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.319727 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g"] Oct 02 11:34:41 crc kubenswrapper[4835]: E1002 11:34:41.320247 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.320270 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.320493 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.321426 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.325472 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.325472 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.329565 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.330127 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.330447 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.334556 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g"] Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.505539 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.505638 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmlp\" (UniqueName: \"kubernetes.io/projected/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-kube-api-access-ffmlp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.505695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.505721 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.607835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.608156 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmlp\" (UniqueName: \"kubernetes.io/projected/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-kube-api-access-ffmlp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.608392 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.608548 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.612675 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.613186 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.616455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.628615 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ffmlp\" (UniqueName: \"kubernetes.io/projected/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-kube-api-access-ffmlp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8965g\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.640455 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.983855 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:34:41 crc kubenswrapper[4835]: I1002 11:34:41.984181 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.179494 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g"] Oct 02 11:34:42 crc kubenswrapper[4835]: W1002 11:34:42.187516 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329e8cb4_3d66_4c42_be6d_9cb71fdb008a.slice/crio-efba6f44826c6db57291e4b3b5c35ea7457448437e9da4b41125ca5bef3bb013 WatchSource:0}: Error finding container efba6f44826c6db57291e4b3b5c35ea7457448437e9da4b41125ca5bef3bb013: Status 404 returned error can't find the container with id efba6f44826c6db57291e4b3b5c35ea7457448437e9da4b41125ca5bef3bb013 Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.222697 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" event={"ID":"329e8cb4-3d66-4c42-be6d-9cb71fdb008a","Type":"ContainerStarted","Data":"efba6f44826c6db57291e4b3b5c35ea7457448437e9da4b41125ca5bef3bb013"} Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.400108 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-27mn6"] Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.400419 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-27mn6" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="registry-server" containerID="cri-o://809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c" gracePeriod=2 Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.763304 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.928604 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-catalog-content\") pod \"a5a1543b-05c0-41de-b492-c4a266793d2b\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.928647 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tl8\" (UniqueName: \"kubernetes.io/projected/a5a1543b-05c0-41de-b492-c4a266793d2b-kube-api-access-f7tl8\") pod \"a5a1543b-05c0-41de-b492-c4a266793d2b\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.928839 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-utilities\") pod \"a5a1543b-05c0-41de-b492-c4a266793d2b\" (UID: \"a5a1543b-05c0-41de-b492-c4a266793d2b\") " Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.929970 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-utilities" (OuterVolumeSpecName: "utilities") pod "a5a1543b-05c0-41de-b492-c4a266793d2b" (UID: "a5a1543b-05c0-41de-b492-c4a266793d2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:42 crc kubenswrapper[4835]: I1002 11:34:42.933100 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a1543b-05c0-41de-b492-c4a266793d2b-kube-api-access-f7tl8" (OuterVolumeSpecName: "kube-api-access-f7tl8") pod "a5a1543b-05c0-41de-b492-c4a266793d2b" (UID: "a5a1543b-05c0-41de-b492-c4a266793d2b"). InnerVolumeSpecName "kube-api-access-f7tl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.031521 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7tl8\" (UniqueName: \"kubernetes.io/projected/a5a1543b-05c0-41de-b492-c4a266793d2b-kube-api-access-f7tl8\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.031554 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.230880 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" event={"ID":"329e8cb4-3d66-4c42-be6d-9cb71fdb008a","Type":"ContainerStarted","Data":"28fa7609459949eb22ceddd379882e98c9f6696fdf252f7a75973e97095994b0"} Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.233808 4835 generic.go:334] "Generic (PLEG): container finished" podID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerID="809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c" exitCode=0 Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.233860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27mn6" event={"ID":"a5a1543b-05c0-41de-b492-c4a266793d2b","Type":"ContainerDied","Data":"809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c"} Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.233887 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27mn6" event={"ID":"a5a1543b-05c0-41de-b492-c4a266793d2b","Type":"ContainerDied","Data":"e0edbb1aed2ad152f2d170f610f2bd16a3b78d2c464e2759995072976b1c1b3b"} Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.233908 4835 scope.go:117] "RemoveContainer" containerID="809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.233865 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-27mn6" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.254525 4835 scope.go:117] "RemoveContainer" containerID="3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.255050 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" podStartSLOduration=1.843410597 podStartE2EDuration="2.255032286s" podCreationTimestamp="2025-10-02 11:34:41 +0000 UTC" firstStartedPulling="2025-10-02 11:34:42.190016563 +0000 UTC m=+2358.749924144" lastFinishedPulling="2025-10-02 11:34:42.601638252 +0000 UTC m=+2359.161545833" observedRunningTime="2025-10-02 11:34:43.249100416 +0000 UTC m=+2359.809008017" watchObservedRunningTime="2025-10-02 11:34:43.255032286 +0000 UTC m=+2359.814939867" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.273527 4835 scope.go:117] "RemoveContainer" containerID="deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.311059 4835 scope.go:117] "RemoveContainer" containerID="809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c" Oct 02 11:34:43 crc kubenswrapper[4835]: E1002 11:34:43.313402 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c\": container with ID starting with 809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c not found: ID does not exist" containerID="809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.313438 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c"} err="failed to get container status \"809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c\": rpc error: code = NotFound desc = could not find container \"809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c\": container with ID starting with 809cdb4f3f43f74ff5318b7310464767bf3b877b2f396d89c33a9de2d5524a3c not found: ID does not exist" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.313461 4835 scope.go:117] "RemoveContainer" containerID="3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2" Oct 02 11:34:43 crc kubenswrapper[4835]: E1002 11:34:43.313757 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2\": container with ID starting with 3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2 not found: ID does not exist" containerID="3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.313787 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2"} err="failed to get container status \"3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2\": rpc error: code = NotFound desc = could not find container \"3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2\": container with ID starting with 3b28ecce684048c40345876bd68396725e0bed3c8ad59578fd16cb2aea97cbb2 not found: ID does not 
exist" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.313800 4835 scope.go:117] "RemoveContainer" containerID="deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05" Oct 02 11:34:43 crc kubenswrapper[4835]: E1002 11:34:43.314009 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05\": container with ID starting with deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05 not found: ID does not exist" containerID="deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05" Oct 02 11:34:43 crc kubenswrapper[4835]: I1002 11:34:43.314041 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05"} err="failed to get container status \"deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05\": rpc error: code = NotFound desc = could not find container \"deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05\": container with ID starting with deab834fe891bb95d9b67ab49df9250a629bd5343d260521c2335d7379d36f05 not found: ID does not exist" Oct 02 11:34:45 crc kubenswrapper[4835]: I1002 11:34:45.949646 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5a1543b-05c0-41de-b492-c4a266793d2b" (UID: "a5a1543b-05c0-41de-b492-c4a266793d2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:45 crc kubenswrapper[4835]: I1002 11:34:45.989522 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1543b-05c0-41de-b492-c4a266793d2b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:46 crc kubenswrapper[4835]: I1002 11:34:46.285748 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-27mn6"] Oct 02 11:34:46 crc kubenswrapper[4835]: I1002 11:34:46.292806 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-27mn6"] Oct 02 11:34:46 crc kubenswrapper[4835]: I1002 11:34:46.340986 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:46 crc kubenswrapper[4835]: I1002 11:34:46.341752 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:46 crc kubenswrapper[4835]: I1002 11:34:46.394812 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:47 crc kubenswrapper[4835]: I1002 11:34:47.366242 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:48 crc kubenswrapper[4835]: I1002 11:34:48.266160 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" path="/var/lib/kubelet/pods/a5a1543b-05c0-41de-b492-c4a266793d2b/volumes" Oct 02 11:34:48 crc kubenswrapper[4835]: I1002 11:34:48.405766 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szqkj"] Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.338762 4835 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-szqkj" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="registry-server" containerID="cri-o://ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b" gracePeriod=2 Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.830541 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.886259 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-utilities\") pod \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.886316 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hllh\" (UniqueName: \"kubernetes.io/projected/028f73f2-8c0b-42c8-ab03-9778518a1ae7-kube-api-access-9hllh\") pod \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.886503 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-catalog-content\") pod \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\" (UID: \"028f73f2-8c0b-42c8-ab03-9778518a1ae7\") " Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.887151 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-utilities" (OuterVolumeSpecName: "utilities") pod "028f73f2-8c0b-42c8-ab03-9778518a1ae7" (UID: "028f73f2-8c0b-42c8-ab03-9778518a1ae7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.892015 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028f73f2-8c0b-42c8-ab03-9778518a1ae7-kube-api-access-9hllh" (OuterVolumeSpecName: "kube-api-access-9hllh") pod "028f73f2-8c0b-42c8-ab03-9778518a1ae7" (UID: "028f73f2-8c0b-42c8-ab03-9778518a1ae7"). InnerVolumeSpecName "kube-api-access-9hllh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.944162 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "028f73f2-8c0b-42c8-ab03-9778518a1ae7" (UID: "028f73f2-8c0b-42c8-ab03-9778518a1ae7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.987980 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.988022 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028f73f2-8c0b-42c8-ab03-9778518a1ae7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:50 crc kubenswrapper[4835]: I1002 11:34:50.988041 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hllh\" (UniqueName: \"kubernetes.io/projected/028f73f2-8c0b-42c8-ab03-9778518a1ae7-kube-api-access-9hllh\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.349630 4835 generic.go:334] "Generic (PLEG): container finished" podID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerID="ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b" exitCode=0 Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.349676 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szqkj" event={"ID":"028f73f2-8c0b-42c8-ab03-9778518a1ae7","Type":"ContainerDied","Data":"ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b"} Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.349704 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szqkj" event={"ID":"028f73f2-8c0b-42c8-ab03-9778518a1ae7","Type":"ContainerDied","Data":"ec86264d8f4d664644aaa209966df3bebb5be46a2396910b3d99e2836c7d0503"} Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.349722 4835 scope.go:117] "RemoveContainer" containerID="ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.349758 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szqkj" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.382791 4835 scope.go:117] "RemoveContainer" containerID="53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.402900 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szqkj"] Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.409558 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-szqkj"] Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.418429 4835 scope.go:117] "RemoveContainer" containerID="356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.452400 4835 scope.go:117] "RemoveContainer" containerID="ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b" Oct 02 11:34:51 crc kubenswrapper[4835]: E1002 11:34:51.452808 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b\": container with ID starting with ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b not found: ID does not exist" containerID="ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.452846 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b"} err="failed to get container status \"ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b\": rpc error: code = NotFound desc = could not find container \"ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b\": container with ID starting with ea0cdb07da327ae3e88ed981287a3270b61cf9e145d7b26d39206a3402845b7b not found: ID does not exist" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.452867 4835 scope.go:117] "RemoveContainer" containerID="53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272" Oct 02 11:34:51 crc kubenswrapper[4835]: E1002 11:34:51.453197 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272\": container with ID starting with 53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272 not found: ID does not exist" containerID="53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.453299 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272"} err="failed to get container status \"53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272\": rpc error: code = NotFound desc = could not find container \"53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272\": container with ID starting with 53ef8e205bdaab5a9648e971292d7432fcf9b224d9d834986027b48f1e6cd272 not found: ID does not exist" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.453354 4835 scope.go:117] "RemoveContainer" containerID="356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad" Oct 02 11:34:51 crc kubenswrapper[4835]: E1002 11:34:51.453917 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad\": container with ID starting with 356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad not found: ID does not exist" containerID="356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad" Oct 02 11:34:51 crc kubenswrapper[4835]: I1002 11:34:51.453968 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad"} err="failed to get container status \"356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad\": rpc error: code = NotFound desc = could not find container \"356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad\": container with ID starting with 356c0e215fd21eeeb0fd3a92b23b21d59b1b02354501a688de9c829532431aad not found: ID does not exist" Oct 02 11:34:52 crc kubenswrapper[4835]: I1002 11:34:52.264083 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" path="/var/lib/kubelet/pods/028f73f2-8c0b-42c8-ab03-9778518a1ae7/volumes" Oct 02 11:35:11 crc kubenswrapper[4835]: I1002 11:35:11.984396 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:35:11 crc kubenswrapper[4835]: I1002 11:35:11.985075 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:35:14 crc kubenswrapper[4835]: I1002 11:35:14.549592 4835 generic.go:334] "Generic (PLEG): container finished" podID="329e8cb4-3d66-4c42-be6d-9cb71fdb008a" containerID="28fa7609459949eb22ceddd379882e98c9f6696fdf252f7a75973e97095994b0" exitCode=0 Oct 02 11:35:14 crc kubenswrapper[4835]: I1002 11:35:14.549695 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" event={"ID":"329e8cb4-3d66-4c42-be6d-9cb71fdb008a","Type":"ContainerDied","Data":"28fa7609459949eb22ceddd379882e98c9f6696fdf252f7a75973e97095994b0"} Oct 02 11:35:15 crc kubenswrapper[4835]: I1002 11:35:15.950670 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.135379 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ssh-key\") pod \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.136333 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ceph\") pod \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.136436 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-inventory\") pod \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.136587 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffmlp\" (UniqueName: \"kubernetes.io/projected/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-kube-api-access-ffmlp\") pod \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\" (UID: \"329e8cb4-3d66-4c42-be6d-9cb71fdb008a\") " Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.141140 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ceph" (OuterVolumeSpecName: "ceph") pod "329e8cb4-3d66-4c42-be6d-9cb71fdb008a" (UID: "329e8cb4-3d66-4c42-be6d-9cb71fdb008a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.148712 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-kube-api-access-ffmlp" (OuterVolumeSpecName: "kube-api-access-ffmlp") pod "329e8cb4-3d66-4c42-be6d-9cb71fdb008a" (UID: "329e8cb4-3d66-4c42-be6d-9cb71fdb008a"). InnerVolumeSpecName "kube-api-access-ffmlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.169553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-inventory" (OuterVolumeSpecName: "inventory") pod "329e8cb4-3d66-4c42-be6d-9cb71fdb008a" (UID: "329e8cb4-3d66-4c42-be6d-9cb71fdb008a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.169942 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "329e8cb4-3d66-4c42-be6d-9cb71fdb008a" (UID: "329e8cb4-3d66-4c42-be6d-9cb71fdb008a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.238514 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.238586 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.238595 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.238604 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffmlp\" (UniqueName: \"kubernetes.io/projected/329e8cb4-3d66-4c42-be6d-9cb71fdb008a-kube-api-access-ffmlp\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.565621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" event={"ID":"329e8cb4-3d66-4c42-be6d-9cb71fdb008a","Type":"ContainerDied","Data":"efba6f44826c6db57291e4b3b5c35ea7457448437e9da4b41125ca5bef3bb013"} Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.565909 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efba6f44826c6db57291e4b3b5c35ea7457448437e9da4b41125ca5bef3bb013" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.565696 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8965g" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.641281 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk"] Oct 02 11:35:16 crc kubenswrapper[4835]: E1002 11:35:16.641861 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="extract-content" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.641907 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="extract-content" Oct 02 11:35:16 crc kubenswrapper[4835]: E1002 11:35:16.641930 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="registry-server" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.641944 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="registry-server" Oct 02 11:35:16 crc kubenswrapper[4835]: E1002 11:35:16.642774 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="registry-server" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.642832 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="registry-server" Oct 02 11:35:16 crc kubenswrapper[4835]: E1002 11:35:16.642856 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="extract-utilities" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.642864 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="extract-utilities" Oct 02 11:35:16 crc kubenswrapper[4835]: E1002 11:35:16.642908 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="extract-utilities" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.642915 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="extract-utilities" Oct 02 11:35:16 crc kubenswrapper[4835]: E1002 11:35:16.642923 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329e8cb4-3d66-4c42-be6d-9cb71fdb008a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.642930 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="329e8cb4-3d66-4c42-be6d-9cb71fdb008a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:35:16 crc kubenswrapper[4835]: E1002 11:35:16.642940 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="extract-content" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.642945 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="extract-content" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.643309 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="028f73f2-8c0b-42c8-ab03-9778518a1ae7" containerName="registry-server" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.643337 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="329e8cb4-3d66-4c42-be6d-9cb71fdb008a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.643351 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a1543b-05c0-41de-b492-c4a266793d2b" containerName="registry-server" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.644187 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.647183 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.647545 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.647626 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.647995 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.648384 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk"] Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.648844 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.745529 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.745589 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.745677 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.745777 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8krw\" (UniqueName: \"kubernetes.io/projected/44c2bc95-3dad-486a-b15d-83158d9619d1-kube-api-access-j8krw\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.846968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.847026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.847086 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.847135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8krw\" (UniqueName: \"kubernetes.io/projected/44c2bc95-3dad-486a-b15d-83158d9619d1-kube-api-access-j8krw\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.853021 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.855264 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.856203 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.866158 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8krw\" (UniqueName: \"kubernetes.io/projected/44c2bc95-3dad-486a-b15d-83158d9619d1-kube-api-access-j8krw\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:16 crc kubenswrapper[4835]: I1002 11:35:16.963805 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:17 crc kubenswrapper[4835]: I1002 11:35:17.496155 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk"] Oct 02 11:35:17 crc kubenswrapper[4835]: I1002 11:35:17.574989 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" event={"ID":"44c2bc95-3dad-486a-b15d-83158d9619d1","Type":"ContainerStarted","Data":"b183dfc631e81868c13a11d23dccdbefc89b9061fd46433ac52ff11607f9aff7"} Oct 02 11:35:19 crc kubenswrapper[4835]: I1002 11:35:19.591179 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" event={"ID":"44c2bc95-3dad-486a-b15d-83158d9619d1","Type":"ContainerStarted","Data":"3892a4b605fc665703b4e843c9fa866dd2cfcd5e956f868947ba53a93226545e"} Oct 02 11:35:19 crc kubenswrapper[4835]: I1002 11:35:19.607360 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" podStartSLOduration=2.489348636 podStartE2EDuration="3.607337478s" podCreationTimestamp="2025-10-02 11:35:16 +0000 UTC" firstStartedPulling="2025-10-02 11:35:17.506901082 +0000 UTC m=+2394.066808663" lastFinishedPulling="2025-10-02 11:35:18.624889914 +0000 UTC m=+2395.184797505" observedRunningTime="2025-10-02 11:35:19.603409155 +0000 UTC m=+2396.163316756" watchObservedRunningTime="2025-10-02 11:35:19.607337478 +0000 UTC m=+2396.167245069" Oct 02 11:35:22 crc kubenswrapper[4835]: I1002 11:35:22.614615 4835 generic.go:334] "Generic (PLEG): container finished" podID="44c2bc95-3dad-486a-b15d-83158d9619d1" containerID="3892a4b605fc665703b4e843c9fa866dd2cfcd5e956f868947ba53a93226545e" exitCode=0 Oct 02 11:35:22 crc kubenswrapper[4835]: I1002 11:35:22.614840 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" event={"ID":"44c2bc95-3dad-486a-b15d-83158d9619d1","Type":"ContainerDied","Data":"3892a4b605fc665703b4e843c9fa866dd2cfcd5e956f868947ba53a93226545e"} Oct 02 11:35:23 crc kubenswrapper[4835]: I1002 11:35:23.998149 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.173563 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8krw\" (UniqueName: \"kubernetes.io/projected/44c2bc95-3dad-486a-b15d-83158d9619d1-kube-api-access-j8krw\") pod \"44c2bc95-3dad-486a-b15d-83158d9619d1\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.173676 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-inventory\") pod \"44c2bc95-3dad-486a-b15d-83158d9619d1\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.173795 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ssh-key\") pod \"44c2bc95-3dad-486a-b15d-83158d9619d1\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.174492 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ceph\") pod \"44c2bc95-3dad-486a-b15d-83158d9619d1\" (UID: \"44c2bc95-3dad-486a-b15d-83158d9619d1\") " Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.180918 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ceph" (OuterVolumeSpecName: "ceph") pod "44c2bc95-3dad-486a-b15d-83158d9619d1" (UID: "44c2bc95-3dad-486a-b15d-83158d9619d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.185505 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c2bc95-3dad-486a-b15d-83158d9619d1-kube-api-access-j8krw" (OuterVolumeSpecName: "kube-api-access-j8krw") pod "44c2bc95-3dad-486a-b15d-83158d9619d1" (UID: "44c2bc95-3dad-486a-b15d-83158d9619d1"). InnerVolumeSpecName "kube-api-access-j8krw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.208206 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44c2bc95-3dad-486a-b15d-83158d9619d1" (UID: "44c2bc95-3dad-486a-b15d-83158d9619d1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.210510 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-inventory" (OuterVolumeSpecName: "inventory") pod "44c2bc95-3dad-486a-b15d-83158d9619d1" (UID: "44c2bc95-3dad-486a-b15d-83158d9619d1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.275960 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.276003 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8krw\" (UniqueName: \"kubernetes.io/projected/44c2bc95-3dad-486a-b15d-83158d9619d1-kube-api-access-j8krw\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.276018 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.276029 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44c2bc95-3dad-486a-b15d-83158d9619d1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.634696 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" event={"ID":"44c2bc95-3dad-486a-b15d-83158d9619d1","Type":"ContainerDied","Data":"b183dfc631e81868c13a11d23dccdbefc89b9061fd46433ac52ff11607f9aff7"} Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.634744 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b183dfc631e81868c13a11d23dccdbefc89b9061fd46433ac52ff11607f9aff7" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.634791 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.712964 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x"] Oct 02 11:35:24 crc kubenswrapper[4835]: E1002 11:35:24.713352 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c2bc95-3dad-486a-b15d-83158d9619d1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.713370 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c2bc95-3dad-486a-b15d-83158d9619d1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.713610 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c2bc95-3dad-486a-b15d-83158d9619d1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.714277 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.716419 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.716649 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.721023 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.721343 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.721525 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.722959 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x"] Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.885797 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.886118 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.886157 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrprp\" (UniqueName: \"kubernetes.io/projected/08ff3367-a3dd-419e-a60e-1e4892b68d1c-kube-api-access-qrprp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.886226 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.991353 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.991598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.991678 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.991793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrprp\" (UniqueName: \"kubernetes.io/projected/08ff3367-a3dd-419e-a60e-1e4892b68d1c-kube-api-access-qrprp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:24 crc kubenswrapper[4835]: I1002 11:35:24.997577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:25 crc kubenswrapper[4835]: I1002 11:35:25.005604 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:25 crc kubenswrapper[4835]: I1002 11:35:25.020067 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:25 crc kubenswrapper[4835]: I1002 11:35:25.024011 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrprp\" (UniqueName: \"kubernetes.io/projected/08ff3367-a3dd-419e-a60e-1e4892b68d1c-kube-api-access-qrprp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-27j6x\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:25 crc kubenswrapper[4835]: I1002 11:35:25.033993 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:35:25 crc kubenswrapper[4835]: I1002 11:35:25.544098 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x"] Oct 02 11:35:25 crc kubenswrapper[4835]: I1002 11:35:25.643891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" event={"ID":"08ff3367-a3dd-419e-a60e-1e4892b68d1c","Type":"ContainerStarted","Data":"3b3b9066a693d8d833fe804cb2adfe5855b6a0a1fbfece0df4741f095785420c"} Oct 02 11:35:26 crc kubenswrapper[4835]: I1002 11:35:26.654784 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" event={"ID":"08ff3367-a3dd-419e-a60e-1e4892b68d1c","Type":"ContainerStarted","Data":"c1800b3243aa579cdbcacce7d98889da126d4bc57a49643042445c9743dc5143"} Oct 02 11:35:26 crc kubenswrapper[4835]: I1002 11:35:26.684297 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" podStartSLOduration=2.1597591720000002 podStartE2EDuration="2.684269669s" podCreationTimestamp="2025-10-02 11:35:24 +0000 UTC" firstStartedPulling="2025-10-02 11:35:25.552040698 +0000 UTC m=+2402.111948279" lastFinishedPulling="2025-10-02 11:35:26.076551195 +0000 UTC m=+2402.636458776" observedRunningTime="2025-10-02 11:35:26.673255733 +0000 UTC m=+2403.233163324" watchObservedRunningTime="2025-10-02 11:35:26.684269669 +0000 UTC m=+2403.244177290" Oct 02 11:35:41 crc kubenswrapper[4835]: I1002 11:35:41.984679 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:35:41 crc kubenswrapper[4835]: I1002 11:35:41.985339 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:35:41 crc kubenswrapper[4835]: I1002 11:35:41.985397 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:35:41 crc kubenswrapper[4835]: I1002 11:35:41.986274 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:35:41 crc kubenswrapper[4835]: I1002 11:35:41.986338 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" gracePeriod=600 Oct 02 11:35:42 crc kubenswrapper[4835]: E1002 11:35:42.116418 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:35:42 crc kubenswrapper[4835]: I1002 11:35:42.784253 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" exitCode=0 Oct 02 11:35:42 crc kubenswrapper[4835]: I1002 11:35:42.784311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3"} Oct 02 11:35:42 crc kubenswrapper[4835]: I1002 11:35:42.784359 4835 scope.go:117] "RemoveContainer" containerID="73a074b76cb9c97dae601976ee29c575b71ca7b26055a59b51af688465758233" Oct 02 11:35:42 crc kubenswrapper[4835]: I1002 11:35:42.785313 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:35:42 crc kubenswrapper[4835]: E1002 11:35:42.785626 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:35:53 crc kubenswrapper[4835]: I1002 11:35:53.252140 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:35:53 crc kubenswrapper[4835]: E1002 11:35:53.253295 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:36:05 crc kubenswrapper[4835]: I1002 11:36:05.252385 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:36:05 crc kubenswrapper[4835]: E1002 11:36:05.254979 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:36:05 crc kubenswrapper[4835]: I1002 11:36:05.989722 4835 generic.go:334] "Generic (PLEG): container finished" podID="08ff3367-a3dd-419e-a60e-1e4892b68d1c" containerID="c1800b3243aa579cdbcacce7d98889da126d4bc57a49643042445c9743dc5143" exitCode=0 Oct 02 11:36:05 crc kubenswrapper[4835]: I1002 11:36:05.989806 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" event={"ID":"08ff3367-a3dd-419e-a60e-1e4892b68d1c","Type":"ContainerDied","Data":"c1800b3243aa579cdbcacce7d98889da126d4bc57a49643042445c9743dc5143"} Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.388808 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.471827 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ssh-key\") pod \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.471910 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrprp\" (UniqueName: \"kubernetes.io/projected/08ff3367-a3dd-419e-a60e-1e4892b68d1c-kube-api-access-qrprp\") pod \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.471957 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-inventory\") pod \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.472114 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ceph\") pod \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\" (UID: \"08ff3367-a3dd-419e-a60e-1e4892b68d1c\") " Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.477355 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ff3367-a3dd-419e-a60e-1e4892b68d1c-kube-api-access-qrprp" (OuterVolumeSpecName: "kube-api-access-qrprp") pod "08ff3367-a3dd-419e-a60e-1e4892b68d1c" (UID: "08ff3367-a3dd-419e-a60e-1e4892b68d1c"). InnerVolumeSpecName "kube-api-access-qrprp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.477409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ceph" (OuterVolumeSpecName: "ceph") pod "08ff3367-a3dd-419e-a60e-1e4892b68d1c" (UID: "08ff3367-a3dd-419e-a60e-1e4892b68d1c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.502166 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-inventory" (OuterVolumeSpecName: "inventory") pod "08ff3367-a3dd-419e-a60e-1e4892b68d1c" (UID: "08ff3367-a3dd-419e-a60e-1e4892b68d1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.503807 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "08ff3367-a3dd-419e-a60e-1e4892b68d1c" (UID: "08ff3367-a3dd-419e-a60e-1e4892b68d1c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.574365 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.574406 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.574417 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrprp\" (UniqueName: \"kubernetes.io/projected/08ff3367-a3dd-419e-a60e-1e4892b68d1c-kube-api-access-qrprp\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:07 crc kubenswrapper[4835]: I1002 11:36:07.574429 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ff3367-a3dd-419e-a60e-1e4892b68d1c-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.008898 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" event={"ID":"08ff3367-a3dd-419e-a60e-1e4892b68d1c","Type":"ContainerDied","Data":"3b3b9066a693d8d833fe804cb2adfe5855b6a0a1fbfece0df4741f095785420c"} Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.008956 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b3b9066a693d8d833fe804cb2adfe5855b6a0a1fbfece0df4741f095785420c" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.009542 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-27j6x" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.103099 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9p8hq"] Oct 02 11:36:08 crc kubenswrapper[4835]: E1002 11:36:08.103555 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ff3367-a3dd-419e-a60e-1e4892b68d1c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.103608 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ff3367-a3dd-419e-a60e-1e4892b68d1c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.103824 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ff3367-a3dd-419e-a60e-1e4892b68d1c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.104546 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.112666 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.112953 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.113369 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.113454 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.113523 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.114574 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9p8hq"] Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.193023 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ceph\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.193190 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.193253 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.193278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg8dn\" (UniqueName: \"kubernetes.io/projected/e681c894-43b7-4a03-a987-7f7fb40f0754-kube-api-access-rg8dn\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.295800 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.295880 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.295981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg8dn\" (UniqueName: \"kubernetes.io/projected/e681c894-43b7-4a03-a987-7f7fb40f0754-kube-api-access-rg8dn\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.296136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ceph\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.301114 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.301724 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.302070 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ceph\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.316490 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg8dn\" (UniqueName: \"kubernetes.io/projected/e681c894-43b7-4a03-a987-7f7fb40f0754-kube-api-access-rg8dn\") pod \"ssh-known-hosts-edpm-deployment-9p8hq\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.432441 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:08 crc kubenswrapper[4835]: I1002 11:36:08.954762 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9p8hq"] Oct 02 11:36:09 crc kubenswrapper[4835]: I1002 11:36:09.016851 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" event={"ID":"e681c894-43b7-4a03-a987-7f7fb40f0754","Type":"ContainerStarted","Data":"e64383ae95c80d09fd1fcec4a5ebaaf31f065b6a25dd098311cd6cfc2f64a7e3"} Oct 02 11:36:10 crc kubenswrapper[4835]: I1002 11:36:10.028599 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" event={"ID":"e681c894-43b7-4a03-a987-7f7fb40f0754","Type":"ContainerStarted","Data":"9c310c5875906a5c5e33f705744a789a0dc26d83a13dcaaf8bc693e9634c2072"} Oct 02 11:36:10 crc kubenswrapper[4835]: I1002 11:36:10.050504 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" podStartSLOduration=1.613699494 podStartE2EDuration="2.050485884s" podCreationTimestamp="2025-10-02 11:36:08 +0000 UTC" firstStartedPulling="2025-10-02 11:36:08.964737306 +0000 UTC m=+2445.524644887" lastFinishedPulling="2025-10-02 11:36:09.401523696 +0000 UTC m=+2445.961431277" observedRunningTime="2025-10-02 11:36:10.047508288 +0000 UTC m=+2446.607415879" watchObservedRunningTime="2025-10-02 11:36:10.050485884 +0000 UTC m=+2446.610393465" Oct 02 11:36:18 crc kubenswrapper[4835]: I1002 11:36:18.114322 4835 generic.go:334] "Generic (PLEG): container finished" podID="e681c894-43b7-4a03-a987-7f7fb40f0754" containerID="9c310c5875906a5c5e33f705744a789a0dc26d83a13dcaaf8bc693e9634c2072" exitCode=0 Oct 02 11:36:18 crc kubenswrapper[4835]: I1002 11:36:18.114436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" event={"ID":"e681c894-43b7-4a03-a987-7f7fb40f0754","Type":"ContainerDied","Data":"9c310c5875906a5c5e33f705744a789a0dc26d83a13dcaaf8bc693e9634c2072"} Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.253233 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:36:19 crc kubenswrapper[4835]: E1002 11:36:19.253535 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.503829 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.605015 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ceph\") pod \"e681c894-43b7-4a03-a987-7f7fb40f0754\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.605086 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ssh-key-openstack-edpm-ipam\") pod \"e681c894-43b7-4a03-a987-7f7fb40f0754\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.605189 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-inventory-0\") pod \"e681c894-43b7-4a03-a987-7f7fb40f0754\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.605265 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg8dn\" (UniqueName: \"kubernetes.io/projected/e681c894-43b7-4a03-a987-7f7fb40f0754-kube-api-access-rg8dn\") pod \"e681c894-43b7-4a03-a987-7f7fb40f0754\" (UID: \"e681c894-43b7-4a03-a987-7f7fb40f0754\") " Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.611766 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ceph" (OuterVolumeSpecName: "ceph") pod "e681c894-43b7-4a03-a987-7f7fb40f0754" (UID: "e681c894-43b7-4a03-a987-7f7fb40f0754"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.612142 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e681c894-43b7-4a03-a987-7f7fb40f0754-kube-api-access-rg8dn" (OuterVolumeSpecName: "kube-api-access-rg8dn") pod "e681c894-43b7-4a03-a987-7f7fb40f0754" (UID: "e681c894-43b7-4a03-a987-7f7fb40f0754"). InnerVolumeSpecName "kube-api-access-rg8dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.632973 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e681c894-43b7-4a03-a987-7f7fb40f0754" (UID: "e681c894-43b7-4a03-a987-7f7fb40f0754"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.633542 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e681c894-43b7-4a03-a987-7f7fb40f0754" (UID: "e681c894-43b7-4a03-a987-7f7fb40f0754"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.708322 4835 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.708361 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg8dn\" (UniqueName: \"kubernetes.io/projected/e681c894-43b7-4a03-a987-7f7fb40f0754-kube-api-access-rg8dn\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.708374 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:19 crc kubenswrapper[4835]: I1002 11:36:19.708387 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e681c894-43b7-4a03-a987-7f7fb40f0754-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.135721 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" event={"ID":"e681c894-43b7-4a03-a987-7f7fb40f0754","Type":"ContainerDied","Data":"e64383ae95c80d09fd1fcec4a5ebaaf31f065b6a25dd098311cd6cfc2f64a7e3"} Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.135761 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64383ae95c80d09fd1fcec4a5ebaaf31f065b6a25dd098311cd6cfc2f64a7e3" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.135826 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9p8hq" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.211118 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf"] Oct 02 11:36:20 crc kubenswrapper[4835]: E1002 11:36:20.211552 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e681c894-43b7-4a03-a987-7f7fb40f0754" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.211570 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e681c894-43b7-4a03-a987-7f7fb40f0754" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.211769 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e681c894-43b7-4a03-a987-7f7fb40f0754" containerName="ssh-known-hosts-edpm-deployment" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.212410 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.219490 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.219490 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.219546 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.220400 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.220528 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.221650 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf"] Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.321093 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.321376 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.321531 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.321605 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gtg2\" (UniqueName: \"kubernetes.io/projected/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-kube-api-access-2gtg2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.422999 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.423377 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gtg2\" (UniqueName: 
\"kubernetes.io/projected/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-kube-api-access-2gtg2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.423405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.423460 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.429084 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.429102 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.436886 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.440530 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gtg2\" (UniqueName: \"kubernetes.io/projected/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-kube-api-access-2gtg2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7nvnf\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:20 crc kubenswrapper[4835]: I1002 11:36:20.541062 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:21 crc kubenswrapper[4835]: I1002 11:36:21.089398 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf"] Oct 02 11:36:21 crc kubenswrapper[4835]: I1002 11:36:21.148778 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" event={"ID":"a12bde0e-0dbc-4c10-a40f-89a3f77690d3","Type":"ContainerStarted","Data":"57f07ddc51201cb686292dab76294655906a280d6fdf423e717182c0f1df4373"} Oct 02 11:36:22 crc kubenswrapper[4835]: I1002 11:36:22.163469 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" event={"ID":"a12bde0e-0dbc-4c10-a40f-89a3f77690d3","Type":"ContainerStarted","Data":"b2f4e54734af51e6f7fcd2d23fd241c928bd95911851d9718b0914a73b5eb09b"} Oct 02 11:36:22 crc kubenswrapper[4835]: I1002 11:36:22.185949 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" podStartSLOduration=1.6112231449999999 podStartE2EDuration="2.185898581s" podCreationTimestamp="2025-10-02 11:36:20 +0000 UTC" firstStartedPulling="2025-10-02 11:36:21.094943225 +0000 UTC m=+2457.654850806" lastFinishedPulling="2025-10-02 11:36:21.669618661 +0000 UTC m=+2458.229526242" observedRunningTime="2025-10-02 11:36:22.180783695 +0000 UTC m=+2458.740691276" watchObservedRunningTime="2025-10-02 11:36:22.185898581 +0000 UTC m=+2458.745806162" Oct 02 11:36:29 crc kubenswrapper[4835]: I1002 11:36:29.214999 4835 generic.go:334] "Generic (PLEG): container finished" podID="a12bde0e-0dbc-4c10-a40f-89a3f77690d3" containerID="b2f4e54734af51e6f7fcd2d23fd241c928bd95911851d9718b0914a73b5eb09b" exitCode=0 Oct 02 11:36:29 crc kubenswrapper[4835]: I1002 11:36:29.215133 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" event={"ID":"a12bde0e-0dbc-4c10-a40f-89a3f77690d3","Type":"ContainerDied","Data":"b2f4e54734af51e6f7fcd2d23fd241c928bd95911851d9718b0914a73b5eb09b"} Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.629742 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.700383 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ceph\") pod \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.700557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-inventory\") pod \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.700599 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ssh-key\") pod \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.700702 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gtg2\" (UniqueName: \"kubernetes.io/projected/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-kube-api-access-2gtg2\") pod \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\" (UID: \"a12bde0e-0dbc-4c10-a40f-89a3f77690d3\") " Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.712751 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-kube-api-access-2gtg2" (OuterVolumeSpecName: "kube-api-access-2gtg2") pod "a12bde0e-0dbc-4c10-a40f-89a3f77690d3" (UID: "a12bde0e-0dbc-4c10-a40f-89a3f77690d3"). InnerVolumeSpecName "kube-api-access-2gtg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.719155 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ceph" (OuterVolumeSpecName: "ceph") pod "a12bde0e-0dbc-4c10-a40f-89a3f77690d3" (UID: "a12bde0e-0dbc-4c10-a40f-89a3f77690d3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.736413 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a12bde0e-0dbc-4c10-a40f-89a3f77690d3" (UID: "a12bde0e-0dbc-4c10-a40f-89a3f77690d3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.747478 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-inventory" (OuterVolumeSpecName: "inventory") pod "a12bde0e-0dbc-4c10-a40f-89a3f77690d3" (UID: "a12bde0e-0dbc-4c10-a40f-89a3f77690d3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.801948 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.801986 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gtg2\" (UniqueName: \"kubernetes.io/projected/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-kube-api-access-2gtg2\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.801998 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:30 crc kubenswrapper[4835]: I1002 11:36:30.802007 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a12bde0e-0dbc-4c10-a40f-89a3f77690d3-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.235423 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" event={"ID":"a12bde0e-0dbc-4c10-a40f-89a3f77690d3","Type":"ContainerDied","Data":"57f07ddc51201cb686292dab76294655906a280d6fdf423e717182c0f1df4373"} Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.235748 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f07ddc51201cb686292dab76294655906a280d6fdf423e717182c0f1df4373" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.235559 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7nvnf" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.252420 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:36:31 crc kubenswrapper[4835]: E1002 11:36:31.252657 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.313744 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk"] Oct 02 11:36:31 crc kubenswrapper[4835]: E1002 11:36:31.314163 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12bde0e-0dbc-4c10-a40f-89a3f77690d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.314180 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12bde0e-0dbc-4c10-a40f-89a3f77690d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.314441 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12bde0e-0dbc-4c10-a40f-89a3f77690d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.315130 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.319728 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk"] Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.322957 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.323249 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.323409 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.323573 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.328556 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.412437 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.412506 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b25f\" (UniqueName: \"kubernetes.io/projected/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-kube-api-access-7b25f\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.412718 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.412871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.514963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.515082 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b25f\" (UniqueName: 
\"kubernetes.io/projected/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-kube-api-access-7b25f\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.515115 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.515206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.520575 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.520848 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.530699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.534505 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b25f\" (UniqueName: \"kubernetes.io/projected/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-kube-api-access-7b25f\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:31 crc kubenswrapper[4835]: I1002 11:36:31.643460 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:32 crc kubenswrapper[4835]: I1002 11:36:32.204244 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk"] Oct 02 11:36:32 crc kubenswrapper[4835]: I1002 11:36:32.245675 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" event={"ID":"f731fa22-ff01-4d11-9cd5-166d6b2d54fb","Type":"ContainerStarted","Data":"d197a21637d07c922fc5372f41d13b580a3252c7b2d9fcf132e08867b36cc6f5"} Oct 02 11:36:33 crc kubenswrapper[4835]: I1002 11:36:33.254210 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" event={"ID":"f731fa22-ff01-4d11-9cd5-166d6b2d54fb","Type":"ContainerStarted","Data":"d31417e53f04aa8fb074872c15b22bc2b8179c197501bc983a3163fd753d3a50"} Oct 02 11:36:33 crc kubenswrapper[4835]: I1002 11:36:33.280911 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" podStartSLOduration=1.833665731 podStartE2EDuration="2.280884961s" podCreationTimestamp="2025-10-02 11:36:31 +0000 UTC" firstStartedPulling="2025-10-02 11:36:32.210956328 +0000 UTC m=+2468.770863909" lastFinishedPulling="2025-10-02 11:36:32.658175558 +0000 UTC m=+2469.218083139" observedRunningTime="2025-10-02 11:36:33.271071169 +0000 UTC m=+2469.830978750" watchObservedRunningTime="2025-10-02 11:36:33.280884961 +0000 UTC m=+2469.840792542" Oct 02 11:36:42 crc kubenswrapper[4835]: I1002 11:36:42.336555 4835 generic.go:334] "Generic (PLEG): container finished" podID="f731fa22-ff01-4d11-9cd5-166d6b2d54fb" containerID="d31417e53f04aa8fb074872c15b22bc2b8179c197501bc983a3163fd753d3a50" exitCode=0 Oct 02 11:36:42 crc kubenswrapper[4835]: I1002 11:36:42.336646 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" event={"ID":"f731fa22-ff01-4d11-9cd5-166d6b2d54fb","Type":"ContainerDied","Data":"d31417e53f04aa8fb074872c15b22bc2b8179c197501bc983a3163fd753d3a50"} Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.252368 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:36:43 crc kubenswrapper[4835]: E1002 11:36:43.252815 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.767739 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.934758 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ssh-key\") pod \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.935038 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-inventory\") pod \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.935105 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b25f\" (UniqueName: \"kubernetes.io/projected/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-kube-api-access-7b25f\") pod \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.935152 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ceph\") pod \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\" (UID: \"f731fa22-ff01-4d11-9cd5-166d6b2d54fb\") " Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.940574 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ceph" (OuterVolumeSpecName: "ceph") pod "f731fa22-ff01-4d11-9cd5-166d6b2d54fb" (UID: "f731fa22-ff01-4d11-9cd5-166d6b2d54fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.950489 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-kube-api-access-7b25f" (OuterVolumeSpecName: "kube-api-access-7b25f") pod "f731fa22-ff01-4d11-9cd5-166d6b2d54fb" (UID: "f731fa22-ff01-4d11-9cd5-166d6b2d54fb"). InnerVolumeSpecName "kube-api-access-7b25f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.968173 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-inventory" (OuterVolumeSpecName: "inventory") pod "f731fa22-ff01-4d11-9cd5-166d6b2d54fb" (UID: "f731fa22-ff01-4d11-9cd5-166d6b2d54fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:43 crc kubenswrapper[4835]: I1002 11:36:43.972767 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f731fa22-ff01-4d11-9cd5-166d6b2d54fb" (UID: "f731fa22-ff01-4d11-9cd5-166d6b2d54fb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.037031 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.037075 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b25f\" (UniqueName: \"kubernetes.io/projected/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-kube-api-access-7b25f\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.037090 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.037103 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f731fa22-ff01-4d11-9cd5-166d6b2d54fb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.355923 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" event={"ID":"f731fa22-ff01-4d11-9cd5-166d6b2d54fb","Type":"ContainerDied","Data":"d197a21637d07c922fc5372f41d13b580a3252c7b2d9fcf132e08867b36cc6f5"} Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.355965 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d197a21637d07c922fc5372f41d13b580a3252c7b2d9fcf132e08867b36cc6f5" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.355987 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.437969 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh"] Oct 02 11:36:44 crc kubenswrapper[4835]: E1002 11:36:44.438627 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f731fa22-ff01-4d11-9cd5-166d6b2d54fb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.438647 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f731fa22-ff01-4d11-9cd5-166d6b2d54fb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.438811 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f731fa22-ff01-4d11-9cd5-166d6b2d54fb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.439376 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.441487 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.441600 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.441778 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.441782 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.442716 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.442759 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.442714 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.443035 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.445891 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk6px\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-kube-api-access-pk6px\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.445947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446028 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446059 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446089 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446135 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446268 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446348 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446435 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446493 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446663 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446799 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.446961 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.455540 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh"] Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.548629 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.548718 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.548754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk6px\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-kube-api-access-pk6px\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.548782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.548811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.548829 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.548976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.549013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.549489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.549528 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.549561 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.549584 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.549645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.553751 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.553805 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.554298 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.554388 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.554526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.554897 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.555070 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.555075 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.556923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.557016 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.557130 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.557293 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.565445 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk6px\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-kube-api-access-pk6px\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:44 crc kubenswrapper[4835]: I1002 11:36:44.755740 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:36:45 crc kubenswrapper[4835]: I1002 11:36:45.282035 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh"] Oct 02 11:36:45 crc kubenswrapper[4835]: W1002 11:36:45.290181 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b93c415_6ba0_4183_b6b1_b47166ed39f1.slice/crio-2335b0f792fb872f0d6dd10be7df488032914aaaa7cc4467c824bdb987f4e13f WatchSource:0}: Error finding container 2335b0f792fb872f0d6dd10be7df488032914aaaa7cc4467c824bdb987f4e13f: Status 404 returned error can't find the container with id 2335b0f792fb872f0d6dd10be7df488032914aaaa7cc4467c824bdb987f4e13f Oct 02 11:36:45 crc kubenswrapper[4835]: I1002 11:36:45.363923 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" event={"ID":"6b93c415-6ba0-4183-b6b1-b47166ed39f1","Type":"ContainerStarted","Data":"2335b0f792fb872f0d6dd10be7df488032914aaaa7cc4467c824bdb987f4e13f"} Oct 02 11:36:46 crc kubenswrapper[4835]: I1002 11:36:46.373646 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" event={"ID":"6b93c415-6ba0-4183-b6b1-b47166ed39f1","Type":"ContainerStarted","Data":"98f39e17cd1ae9b556440bd521a8fd28f5918f8269c445af38d4ff496cc961f5"} Oct 02 11:36:46 crc kubenswrapper[4835]: I1002 11:36:46.391963 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" podStartSLOduration=1.844830611 podStartE2EDuration="2.391945727s" podCreationTimestamp="2025-10-02 11:36:44 +0000 UTC" firstStartedPulling="2025-10-02 11:36:45.293059872 +0000 UTC m=+2481.852967453" lastFinishedPulling="2025-10-02 11:36:45.840174988 +0000 UTC m=+2482.400082569" observedRunningTime="2025-10-02 11:36:46.389830726 +0000 UTC m=+2482.949738317" watchObservedRunningTime="2025-10-02 11:36:46.391945727 +0000 UTC m=+2482.951853308" Oct 02 11:36:58 crc kubenswrapper[4835]: I1002 11:36:58.251788 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:36:58 crc kubenswrapper[4835]: E1002 11:36:58.252531 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:37:10 crc kubenswrapper[4835]: I1002 11:37:10.252542 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:37:10 crc kubenswrapper[4835]: E1002 11:37:10.253338 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:37:15 crc kubenswrapper[4835]: 
I1002 11:37:15.599520 4835 generic.go:334] "Generic (PLEG): container finished" podID="6b93c415-6ba0-4183-b6b1-b47166ed39f1" containerID="98f39e17cd1ae9b556440bd521a8fd28f5918f8269c445af38d4ff496cc961f5" exitCode=0 Oct 02 11:37:15 crc kubenswrapper[4835]: I1002 11:37:15.599623 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" event={"ID":"6b93c415-6ba0-4183-b6b1-b47166ed39f1","Type":"ContainerDied","Data":"98f39e17cd1ae9b556440bd521a8fd28f5918f8269c445af38d4ff496cc961f5"} Oct 02 11:37:16 crc kubenswrapper[4835]: I1002 11:37:16.962580 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.001396 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-repo-setup-combined-ca-bundle\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.001487 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ceph\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.001524 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-libvirt-combined-ca-bundle\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.001586 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.001615 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-inventory\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002483 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ovn-combined-ca-bundle\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002509 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-bootstrap-combined-ca-bundle\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002576 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-neutron-metadata-combined-ca-bundle\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002634 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ssh-key\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002673 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-nova-combined-ca-bundle\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002706 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002768 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk6px\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-kube-api-access-pk6px\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.002803 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\" (UID: \"6b93c415-6ba0-4183-b6b1-b47166ed39f1\") " Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.007139 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.007697 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.007741 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.007756 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ceph" (OuterVolumeSpecName: "ceph") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.008409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.009318 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.010006 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.010692 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.012336 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-kube-api-access-pk6px" (OuterVolumeSpecName: "kube-api-access-pk6px") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "kube-api-access-pk6px". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.014553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.016576 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.030206 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-inventory" (OuterVolumeSpecName: "inventory") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.032202 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b93c415-6ba0-4183-b6b1-b47166ed39f1" (UID: "6b93c415-6ba0-4183-b6b1-b47166ed39f1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105099 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk6px\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-kube-api-access-pk6px\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105136 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105172 4835 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105182 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105195 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105206 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105231 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105246 4835 reconciler_common.go:293] "Volume 
detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105258 4835 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105266 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105274 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105285 4835 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b93c415-6ba0-4183-b6b1-b47166ed39f1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.105292 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6b93c415-6ba0-4183-b6b1-b47166ed39f1-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.617408 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" event={"ID":"6b93c415-6ba0-4183-b6b1-b47166ed39f1","Type":"ContainerDied","Data":"2335b0f792fb872f0d6dd10be7df488032914aaaa7cc4467c824bdb987f4e13f"} Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.617453 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2335b0f792fb872f0d6dd10be7df488032914aaaa7cc4467c824bdb987f4e13f" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.617485 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.718237 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px"] Oct 02 11:37:17 crc kubenswrapper[4835]: E1002 11:37:17.718580 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b93c415-6ba0-4183-b6b1-b47166ed39f1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.718598 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b93c415-6ba0-4183-b6b1-b47166ed39f1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.718777 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b93c415-6ba0-4183-b6b1-b47166ed39f1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.719388 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.723000 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.723008 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.723322 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.723367 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.726802 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.729110 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px"] Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.818702 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.818858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.819212 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r78w\" (UniqueName: \"kubernetes.io/projected/7e2bad15-a601-45d3-96df-fe89c132053d-kube-api-access-2r78w\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.819352 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.921104 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.921254 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r78w\" (UniqueName: 
\"kubernetes.io/projected/7e2bad15-a601-45d3-96df-fe89c132053d-kube-api-access-2r78w\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.921296 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.921378 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.928087 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.941767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.943006 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:17 crc kubenswrapper[4835]: I1002 11:37:17.944348 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r78w\" (UniqueName: \"kubernetes.io/projected/7e2bad15-a601-45d3-96df-fe89c132053d-kube-api-access-2r78w\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:18 crc kubenswrapper[4835]: I1002 11:37:18.036803 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:18 crc kubenswrapper[4835]: I1002 11:37:18.553834 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px"] Oct 02 11:37:18 crc kubenswrapper[4835]: I1002 11:37:18.560074 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:37:18 crc kubenswrapper[4835]: I1002 11:37:18.627903 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" event={"ID":"7e2bad15-a601-45d3-96df-fe89c132053d","Type":"ContainerStarted","Data":"2b67f508bcaad19415c624d22006ad05f7758324047b20bd5dea399bacdfca8e"} Oct 02 11:37:19 crc kubenswrapper[4835]: I1002 11:37:19.637379 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" event={"ID":"7e2bad15-a601-45d3-96df-fe89c132053d","Type":"ContainerStarted","Data":"0dddd7b002a82ef6cdf390de20856871571a8237a8879390f04fcbaf35a058ad"} Oct 02 11:37:19 crc kubenswrapper[4835]: I1002 11:37:19.663799 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" podStartSLOduration=1.958247219 podStartE2EDuration="2.663780459s" podCreationTimestamp="2025-10-02 11:37:17 +0000 UTC" firstStartedPulling="2025-10-02 11:37:18.559716796 +0000 UTC m=+2515.119624377" lastFinishedPulling="2025-10-02 11:37:19.265250036 +0000 UTC m=+2515.825157617" observedRunningTime="2025-10-02 11:37:19.658125506 +0000 UTC m=+2516.218033087" watchObservedRunningTime="2025-10-02 11:37:19.663780459 +0000 UTC m=+2516.223688040" Oct 02 11:37:24 crc kubenswrapper[4835]: I1002 11:37:24.692724 4835 generic.go:334] "Generic (PLEG): container finished" podID="7e2bad15-a601-45d3-96df-fe89c132053d" containerID="0dddd7b002a82ef6cdf390de20856871571a8237a8879390f04fcbaf35a058ad" exitCode=0 Oct 02 11:37:24 crc kubenswrapper[4835]: I1002 11:37:24.692819 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" event={"ID":"7e2bad15-a601-45d3-96df-fe89c132053d","Type":"ContainerDied","Data":"0dddd7b002a82ef6cdf390de20856871571a8237a8879390f04fcbaf35a058ad"} Oct 02 11:37:25 crc kubenswrapper[4835]: I1002 11:37:25.251346 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:37:25 crc kubenswrapper[4835]: E1002 11:37:25.251658 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.132408 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.168052 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ceph\") pod \"7e2bad15-a601-45d3-96df-fe89c132053d\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.168213 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ssh-key\") pod \"7e2bad15-a601-45d3-96df-fe89c132053d\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.168277 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-inventory\") pod \"7e2bad15-a601-45d3-96df-fe89c132053d\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.168309 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r78w\" (UniqueName: \"kubernetes.io/projected/7e2bad15-a601-45d3-96df-fe89c132053d-kube-api-access-2r78w\") pod \"7e2bad15-a601-45d3-96df-fe89c132053d\" (UID: \"7e2bad15-a601-45d3-96df-fe89c132053d\") " Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.180378 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2bad15-a601-45d3-96df-fe89c132053d-kube-api-access-2r78w" (OuterVolumeSpecName: "kube-api-access-2r78w") pod "7e2bad15-a601-45d3-96df-fe89c132053d" (UID: "7e2bad15-a601-45d3-96df-fe89c132053d"). InnerVolumeSpecName "kube-api-access-2r78w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.181450 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ceph" (OuterVolumeSpecName: "ceph") pod "7e2bad15-a601-45d3-96df-fe89c132053d" (UID: "7e2bad15-a601-45d3-96df-fe89c132053d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.203553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e2bad15-a601-45d3-96df-fe89c132053d" (UID: "7e2bad15-a601-45d3-96df-fe89c132053d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.217130 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-inventory" (OuterVolumeSpecName: "inventory") pod "7e2bad15-a601-45d3-96df-fe89c132053d" (UID: "7e2bad15-a601-45d3-96df-fe89c132053d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.270264 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.270308 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.270324 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2bad15-a601-45d3-96df-fe89c132053d-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.270339 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r78w\" (UniqueName: \"kubernetes.io/projected/7e2bad15-a601-45d3-96df-fe89c132053d-kube-api-access-2r78w\") on node \"crc\" DevicePath \"\"" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.710689 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" event={"ID":"7e2bad15-a601-45d3-96df-fe89c132053d","Type":"ContainerDied","Data":"2b67f508bcaad19415c624d22006ad05f7758324047b20bd5dea399bacdfca8e"} Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.710729 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b67f508bcaad19415c624d22006ad05f7758324047b20bd5dea399bacdfca8e" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.710808 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.797573 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw"] Oct 02 11:37:26 crc kubenswrapper[4835]: E1002 11:37:26.798041 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2bad15-a601-45d3-96df-fe89c132053d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.798063 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2bad15-a601-45d3-96df-fe89c132053d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.798277 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2bad15-a601-45d3-96df-fe89c132053d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.799003 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.801567 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.801601 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.801619 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.801567 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.801979 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.803807 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.813852 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw"] Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.880211 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.880508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af586fd6-b857-4897-8b19-4d57315fba61-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.880573 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.880599 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.880627 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 
11:37:26.880661 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zm4n\" (UniqueName: \"kubernetes.io/projected/af586fd6-b857-4897-8b19-4d57315fba61-kube-api-access-2zm4n\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.981623 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.981702 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zm4n\" (UniqueName: \"kubernetes.io/projected/af586fd6-b857-4897-8b19-4d57315fba61-kube-api-access-2zm4n\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.981782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.981824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af586fd6-b857-4897-8b19-4d57315fba61-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.981886 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.981921 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.983124 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af586fd6-b857-4897-8b19-4d57315fba61-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.985839 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.986660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.993032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:26 crc kubenswrapper[4835]: I1002 11:37:26.993564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:27 crc kubenswrapper[4835]: I1002 11:37:27.007993 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zm4n\" (UniqueName: \"kubernetes.io/projected/af586fd6-b857-4897-8b19-4d57315fba61-kube-api-access-2zm4n\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvqw\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:27 crc kubenswrapper[4835]: I1002 11:37:27.118732 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:37:27 crc kubenswrapper[4835]: I1002 11:37:27.625102 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw"] Oct 02 11:37:27 crc kubenswrapper[4835]: I1002 11:37:27.720699 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" event={"ID":"af586fd6-b857-4897-8b19-4d57315fba61","Type":"ContainerStarted","Data":"9a41c6e2b11e769a2c2cfef6c15b6b6ada71722449ce34a8568f9275a1efce0b"} Oct 02 11:37:29 crc kubenswrapper[4835]: I1002 11:37:29.741693 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" event={"ID":"af586fd6-b857-4897-8b19-4d57315fba61","Type":"ContainerStarted","Data":"9d9714ea874cd13d486e4e7506dce7a6fb9cb8d294afe8d07716c88372125889"} Oct 02 11:37:29 crc kubenswrapper[4835]: I1002 11:37:29.777337 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" podStartSLOduration=2.22567704 podStartE2EDuration="3.777316014s" podCreationTimestamp="2025-10-02 11:37:26 +0000 UTC" firstStartedPulling="2025-10-02 11:37:27.631043501 +0000 UTC m=+2524.190951082" lastFinishedPulling="2025-10-02 11:37:29.182682475 +0000 UTC m=+2525.742590056" observedRunningTime="2025-10-02 11:37:29.767249265 +0000 UTC m=+2526.327156916" watchObservedRunningTime="2025-10-02 11:37:29.777316014 +0000 UTC m=+2526.337223595" Oct 02 11:37:39 crc kubenswrapper[4835]: I1002 11:37:39.251726 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:37:39 crc kubenswrapper[4835]: E1002 11:37:39.253490 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:37:54 crc kubenswrapper[4835]: I1002 11:37:54.261732 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:37:54 crc kubenswrapper[4835]: E1002 11:37:54.262892 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:37:59 crc kubenswrapper[4835]: I1002 11:37:59.740047 4835 scope.go:117] "RemoveContainer" containerID="1d416150be87fdd0a12ed6c32b4282fc7b11efa664b9b78321fb2776a0e4c168" Oct 02 11:37:59 crc kubenswrapper[4835]: I1002 11:37:59.768379 4835 scope.go:117] "RemoveContainer" containerID="981c232917c5372f5958617b6368e3333049670c052fcb924c6164ef0d43e8e3" Oct 02 11:37:59 crc kubenswrapper[4835]: I1002 11:37:59.819352 4835 scope.go:117] "RemoveContainer" containerID="15bf4dcf92ecb00604d3a7140bde3fe963a929af47fbdf7727fa8553224814e4" Oct 02 11:38:07 crc kubenswrapper[4835]: I1002 11:38:07.251663 4835 scope.go:117] "RemoveContainer" 
containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:38:07 crc kubenswrapper[4835]: E1002 11:38:07.252559 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:38:21 crc kubenswrapper[4835]: I1002 11:38:21.251296 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:38:21 crc kubenswrapper[4835]: E1002 11:38:21.252086 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:38:34 crc kubenswrapper[4835]: I1002 11:38:34.344066 4835 generic.go:334] "Generic (PLEG): container finished" podID="af586fd6-b857-4897-8b19-4d57315fba61" containerID="9d9714ea874cd13d486e4e7506dce7a6fb9cb8d294afe8d07716c88372125889" exitCode=0 Oct 02 11:38:34 crc kubenswrapper[4835]: I1002 11:38:34.344154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" event={"ID":"af586fd6-b857-4897-8b19-4d57315fba61","Type":"ContainerDied","Data":"9d9714ea874cd13d486e4e7506dce7a6fb9cb8d294afe8d07716c88372125889"} Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.748118 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.894132 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ceph\") pod \"af586fd6-b857-4897-8b19-4d57315fba61\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.894275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ssh-key\") pod \"af586fd6-b857-4897-8b19-4d57315fba61\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.894311 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af586fd6-b857-4897-8b19-4d57315fba61-ovncontroller-config-0\") pod \"af586fd6-b857-4897-8b19-4d57315fba61\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.894343 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-inventory\") pod \"af586fd6-b857-4897-8b19-4d57315fba61\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.894444 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ovn-combined-ca-bundle\") pod \"af586fd6-b857-4897-8b19-4d57315fba61\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.894510 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zm4n\" (UniqueName: \"kubernetes.io/projected/af586fd6-b857-4897-8b19-4d57315fba61-kube-api-access-2zm4n\") pod \"af586fd6-b857-4897-8b19-4d57315fba61\" (UID: \"af586fd6-b857-4897-8b19-4d57315fba61\") " Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.900405 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "af586fd6-b857-4897-8b19-4d57315fba61" (UID: "af586fd6-b857-4897-8b19-4d57315fba61"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.901138 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ceph" (OuterVolumeSpecName: "ceph") pod "af586fd6-b857-4897-8b19-4d57315fba61" (UID: "af586fd6-b857-4897-8b19-4d57315fba61"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.901646 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af586fd6-b857-4897-8b19-4d57315fba61-kube-api-access-2zm4n" (OuterVolumeSpecName: "kube-api-access-2zm4n") pod "af586fd6-b857-4897-8b19-4d57315fba61" (UID: "af586fd6-b857-4897-8b19-4d57315fba61"). InnerVolumeSpecName "kube-api-access-2zm4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.921305 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-inventory" (OuterVolumeSpecName: "inventory") pod "af586fd6-b857-4897-8b19-4d57315fba61" (UID: "af586fd6-b857-4897-8b19-4d57315fba61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.922515 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "af586fd6-b857-4897-8b19-4d57315fba61" (UID: "af586fd6-b857-4897-8b19-4d57315fba61"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.928591 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af586fd6-b857-4897-8b19-4d57315fba61-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "af586fd6-b857-4897-8b19-4d57315fba61" (UID: "af586fd6-b857-4897-8b19-4d57315fba61"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.996712 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zm4n\" (UniqueName: \"kubernetes.io/projected/af586fd6-b857-4897-8b19-4d57315fba61-kube-api-access-2zm4n\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.996742 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.996754 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.996763 4835 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af586fd6-b857-4897-8b19-4d57315fba61-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.996774 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:35 crc kubenswrapper[4835]: I1002 11:38:35.996784 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af586fd6-b857-4897-8b19-4d57315fba61-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.252963 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:38:36 crc kubenswrapper[4835]: E1002 11:38:36.253276 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.363302 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" event={"ID":"af586fd6-b857-4897-8b19-4d57315fba61","Type":"ContainerDied","Data":"9a41c6e2b11e769a2c2cfef6c15b6b6ada71722449ce34a8568f9275a1efce0b"} Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.363355 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a41c6e2b11e769a2c2cfef6c15b6b6ada71722449ce34a8568f9275a1efce0b" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.363329 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvqw" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.453358 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42"] Oct 02 11:38:36 crc kubenswrapper[4835]: E1002 11:38:36.453817 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af586fd6-b857-4897-8b19-4d57315fba61" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.453839 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af586fd6-b857-4897-8b19-4d57315fba61" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.454083 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af586fd6-b857-4897-8b19-4d57315fba61" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.455933 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.458170 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.458486 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.458638 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.460791 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.460963 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.461101 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.461318 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.473762 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42"] Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.610104 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.610146 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.610176 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.610221 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.610794 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.610854 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.610943 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chhnq\" (UniqueName: \"kubernetes.io/projected/5665327e-d24d-4cd7-908c-fc1fd10204fb-kube-api-access-chhnq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.712540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.712606 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.712656 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chhnq\" (UniqueName: \"kubernetes.io/projected/5665327e-d24d-4cd7-908c-fc1fd10204fb-kube-api-access-chhnq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.712775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.712799 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: 
\"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.712835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.712860 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.716518 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.716909 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.717844 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.718004 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.718502 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.718633 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.728469 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chhnq\" (UniqueName: \"kubernetes.io/projected/5665327e-d24d-4cd7-908c-fc1fd10204fb-kube-api-access-chhnq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:36 crc kubenswrapper[4835]: I1002 11:38:36.781171 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:38:37 crc kubenswrapper[4835]: I1002 11:38:37.295123 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42"] Oct 02 11:38:37 crc kubenswrapper[4835]: I1002 11:38:37.372354 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" event={"ID":"5665327e-d24d-4cd7-908c-fc1fd10204fb","Type":"ContainerStarted","Data":"c93dac2ce437b38fe0d9f0ec5ddcc0c29dc180ede74f9c9841bcb791b077c32a"} Oct 02 11:38:38 crc kubenswrapper[4835]: I1002 11:38:38.383278 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" event={"ID":"5665327e-d24d-4cd7-908c-fc1fd10204fb","Type":"ContainerStarted","Data":"528edef29a60280838885754ee6009caa30cef758bb891a71080a261c034a716"} Oct 02 11:38:38 crc kubenswrapper[4835]: I1002 11:38:38.398544 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" podStartSLOduration=1.843953908 podStartE2EDuration="2.398524239s" podCreationTimestamp="2025-10-02 11:38:36 +0000 UTC" firstStartedPulling="2025-10-02 11:38:37.30246076 +0000 UTC m=+2593.862368341" lastFinishedPulling="2025-10-02 11:38:37.857031091 +0000 UTC m=+2594.416938672" observedRunningTime="2025-10-02 11:38:38.397951223 +0000 UTC m=+2594.957858804" watchObservedRunningTime="2025-10-02 11:38:38.398524239 +0000 UTC m=+2594.958431820" Oct 02 11:38:50 crc kubenswrapper[4835]: I1002 11:38:50.252544 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:38:50 crc kubenswrapper[4835]: E1002 11:38:50.253472 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:39:05 crc kubenswrapper[4835]: I1002 11:39:05.251426 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:39:05 crc kubenswrapper[4835]: E1002 11:39:05.252191 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:39:16 crc kubenswrapper[4835]: I1002 11:39:16.251883 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:39:16 crc kubenswrapper[4835]: E1002 11:39:16.253604 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:39:28 crc kubenswrapper[4835]: I1002 11:39:28.252153 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:39:28 crc kubenswrapper[4835]: E1002 11:39:28.253852 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:39:32 crc kubenswrapper[4835]: I1002 11:39:32.869395 4835 generic.go:334] "Generic (PLEG): container finished" podID="5665327e-d24d-4cd7-908c-fc1fd10204fb" containerID="528edef29a60280838885754ee6009caa30cef758bb891a71080a261c034a716" exitCode=0 Oct 02 11:39:32 crc kubenswrapper[4835]: I1002 11:39:32.869497 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" event={"ID":"5665327e-d24d-4cd7-908c-fc1fd10204fb","Type":"ContainerDied","Data":"528edef29a60280838885754ee6009caa30cef758bb891a71080a261c034a716"} Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.265186 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.285660 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-inventory\") pod \"5665327e-d24d-4cd7-908c-fc1fd10204fb\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.285717 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chhnq\" (UniqueName: \"kubernetes.io/projected/5665327e-d24d-4cd7-908c-fc1fd10204fb-kube-api-access-chhnq\") pod \"5665327e-d24d-4cd7-908c-fc1fd10204fb\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.285763 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-nova-metadata-neutron-config-0\") pod \"5665327e-d24d-4cd7-908c-fc1fd10204fb\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.285785 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ceph\") pod \"5665327e-d24d-4cd7-908c-fc1fd10204fb\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.285829 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-metadata-combined-ca-bundle\") pod \"5665327e-d24d-4cd7-908c-fc1fd10204fb\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.285864 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5665327e-d24d-4cd7-908c-fc1fd10204fb\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.285966 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ssh-key\") pod \"5665327e-d24d-4cd7-908c-fc1fd10204fb\" (UID: \"5665327e-d24d-4cd7-908c-fc1fd10204fb\") " Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.291380 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ceph" (OuterVolumeSpecName: "ceph") pod "5665327e-d24d-4cd7-908c-fc1fd10204fb" (UID: "5665327e-d24d-4cd7-908c-fc1fd10204fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.291591 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5665327e-d24d-4cd7-908c-fc1fd10204fb-kube-api-access-chhnq" (OuterVolumeSpecName: "kube-api-access-chhnq") pod "5665327e-d24d-4cd7-908c-fc1fd10204fb" (UID: "5665327e-d24d-4cd7-908c-fc1fd10204fb"). InnerVolumeSpecName "kube-api-access-chhnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.292929 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5665327e-d24d-4cd7-908c-fc1fd10204fb" (UID: "5665327e-d24d-4cd7-908c-fc1fd10204fb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.310003 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5665327e-d24d-4cd7-908c-fc1fd10204fb" (UID: "5665327e-d24d-4cd7-908c-fc1fd10204fb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.312952 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5665327e-d24d-4cd7-908c-fc1fd10204fb" (UID: "5665327e-d24d-4cd7-908c-fc1fd10204fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.324313 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-inventory" (OuterVolumeSpecName: "inventory") pod "5665327e-d24d-4cd7-908c-fc1fd10204fb" (UID: "5665327e-d24d-4cd7-908c-fc1fd10204fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.332752 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5665327e-d24d-4cd7-908c-fc1fd10204fb" (UID: "5665327e-d24d-4cd7-908c-fc1fd10204fb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.388450 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.388492 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.388505 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chhnq\" (UniqueName: \"kubernetes.io/projected/5665327e-d24d-4cd7-908c-fc1fd10204fb-kube-api-access-chhnq\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.388519 4835 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.388532 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.388546 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.388560 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5665327e-d24d-4cd7-908c-fc1fd10204fb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.896952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" event={"ID":"5665327e-d24d-4cd7-908c-fc1fd10204fb","Type":"ContainerDied","Data":"c93dac2ce437b38fe0d9f0ec5ddcc0c29dc180ede74f9c9841bcb791b077c32a"} Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.897011 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c93dac2ce437b38fe0d9f0ec5ddcc0c29dc180ede74f9c9841bcb791b077c32a" Oct 02 11:39:34 crc kubenswrapper[4835]: I1002 11:39:34.897086 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.011816 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz"] Oct 02 11:39:35 crc kubenswrapper[4835]: E1002 11:39:35.012249 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5665327e-d24d-4cd7-908c-fc1fd10204fb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.012267 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5665327e-d24d-4cd7-908c-fc1fd10204fb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.012507 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5665327e-d24d-4cd7-908c-fc1fd10204fb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.013247 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.018916 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.019055 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.019127 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.019301 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.019399 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.019432 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.023352 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz"] Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.202272 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.202386 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.202413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.203277 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.203451 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.203498 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28v8b\" (UniqueName: \"kubernetes.io/projected/9ef15b26-b414-4846-b8e6-6846f04d18c5-kube-api-access-28v8b\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.304730 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.305239 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.305298 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.305396 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.305483 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ceph\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.305516 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28v8b\" (UniqueName: \"kubernetes.io/projected/9ef15b26-b414-4846-b8e6-6846f04d18c5-kube-api-access-28v8b\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.309493 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.309874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.309954 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.311023 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.311467 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.324784 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28v8b\" (UniqueName: \"kubernetes.io/projected/9ef15b26-b414-4846-b8e6-6846f04d18c5-kube-api-access-28v8b\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.343143 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:39:35 crc kubenswrapper[4835]: I1002 11:39:35.888428 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz"] Oct 02 11:39:36 crc kubenswrapper[4835]: I1002 11:39:36.925038 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" event={"ID":"9ef15b26-b414-4846-b8e6-6846f04d18c5","Type":"ContainerStarted","Data":"fbd08b2893db830875e1e7c980d580d16897de4e965f985e9fc502d621fc7e1c"} Oct 02 11:39:36 crc kubenswrapper[4835]: I1002 11:39:36.925625 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" event={"ID":"9ef15b26-b414-4846-b8e6-6846f04d18c5","Type":"ContainerStarted","Data":"f729650592df1d1579e69eed8f0e6d42452d805f0e41ac806b50e649c40cdb1c"} Oct 02 11:39:36 crc kubenswrapper[4835]: I1002 11:39:36.943358 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" podStartSLOduration=2.492424456 podStartE2EDuration="2.943338046s" podCreationTimestamp="2025-10-02 11:39:34 +0000 UTC" firstStartedPulling="2025-10-02 11:39:35.918868744 +0000 UTC m=+2652.478776325" lastFinishedPulling="2025-10-02 11:39:36.369782294 +0000 UTC m=+2652.929689915" observedRunningTime="2025-10-02 11:39:36.942912524 +0000 UTC m=+2653.502820115" watchObservedRunningTime="2025-10-02 11:39:36.943338046 +0000 UTC m=+2653.503245617" Oct 02 11:39:42 crc kubenswrapper[4835]: I1002 11:39:42.251846 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:39:42 crc kubenswrapper[4835]: E1002 11:39:42.253473 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:39:53 crc kubenswrapper[4835]: I1002 11:39:53.253005 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:39:53 crc kubenswrapper[4835]: E1002 11:39:53.254021 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:40:05 crc kubenswrapper[4835]: I1002 11:40:05.252031 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:40:05 crc kubenswrapper[4835]: E1002 11:40:05.252824 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:40:16 crc kubenswrapper[4835]: I1002 11:40:16.251953 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:40:16 crc kubenswrapper[4835]: E1002 11:40:16.252951 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:40:31 crc kubenswrapper[4835]: I1002 11:40:31.251940 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:40:31 crc kubenswrapper[4835]: E1002 11:40:31.252628 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:40:44 crc kubenswrapper[4835]: I1002 11:40:44.258827 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:40:45 crc kubenswrapper[4835]: I1002 11:40:45.502576 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"0244201f52c8dd5e3922b1708549b362fa6db05645f36e77deac378419afe8ad"} Oct 02 11:43:11 crc kubenswrapper[4835]: I1002 11:43:11.983929 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:43:11 crc kubenswrapper[4835]: I1002 11:43:11.984618 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:43:31 crc kubenswrapper[4835]: I1002 11:43:31.934498 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ef15b26-b414-4846-b8e6-6846f04d18c5" containerID="fbd08b2893db830875e1e7c980d580d16897de4e965f985e9fc502d621fc7e1c" exitCode=0 Oct 02 11:43:31 crc kubenswrapper[4835]: I1002 11:43:31.934570 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" event={"ID":"9ef15b26-b414-4846-b8e6-6846f04d18c5","Type":"ContainerDied","Data":"fbd08b2893db830875e1e7c980d580d16897de4e965f985e9fc502d621fc7e1c"} Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.360517 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.523539 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-inventory\") pod \"9ef15b26-b414-4846-b8e6-6846f04d18c5\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.523601 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ceph\") pod \"9ef15b26-b414-4846-b8e6-6846f04d18c5\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.523650 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-combined-ca-bundle\") pod \"9ef15b26-b414-4846-b8e6-6846f04d18c5\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.523692 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28v8b\" (UniqueName: \"kubernetes.io/projected/9ef15b26-b414-4846-b8e6-6846f04d18c5-kube-api-access-28v8b\") pod \"9ef15b26-b414-4846-b8e6-6846f04d18c5\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.523828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-secret-0\") pod \"9ef15b26-b414-4846-b8e6-6846f04d18c5\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.523883 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ssh-key\") pod \"9ef15b26-b414-4846-b8e6-6846f04d18c5\" (UID: \"9ef15b26-b414-4846-b8e6-6846f04d18c5\") " Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.532132 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ceph" (OuterVolumeSpecName: "ceph") pod "9ef15b26-b414-4846-b8e6-6846f04d18c5" (UID: "9ef15b26-b414-4846-b8e6-6846f04d18c5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.532158 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9ef15b26-b414-4846-b8e6-6846f04d18c5" (UID: "9ef15b26-b414-4846-b8e6-6846f04d18c5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.533257 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef15b26-b414-4846-b8e6-6846f04d18c5-kube-api-access-28v8b" (OuterVolumeSpecName: "kube-api-access-28v8b") pod "9ef15b26-b414-4846-b8e6-6846f04d18c5" (UID: "9ef15b26-b414-4846-b8e6-6846f04d18c5"). InnerVolumeSpecName "kube-api-access-28v8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.557674 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-inventory" (OuterVolumeSpecName: "inventory") pod "9ef15b26-b414-4846-b8e6-6846f04d18c5" (UID: "9ef15b26-b414-4846-b8e6-6846f04d18c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.558294 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ef15b26-b414-4846-b8e6-6846f04d18c5" (UID: "9ef15b26-b414-4846-b8e6-6846f04d18c5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.561557 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9ef15b26-b414-4846-b8e6-6846f04d18c5" (UID: "9ef15b26-b414-4846-b8e6-6846f04d18c5"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.626027 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.626058 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.626069 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.626082 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28v8b\" (UniqueName: \"kubernetes.io/projected/9ef15b26-b414-4846-b8e6-6846f04d18c5-kube-api-access-28v8b\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.626091 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.626100 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef15b26-b414-4846-b8e6-6846f04d18c5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.958606 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" event={"ID":"9ef15b26-b414-4846-b8e6-6846f04d18c5","Type":"ContainerDied","Data":"f729650592df1d1579e69eed8f0e6d42452d805f0e41ac806b50e649c40cdb1c"} Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.958662 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f729650592df1d1579e69eed8f0e6d42452d805f0e41ac806b50e649c40cdb1c" Oct 02 11:43:33 crc kubenswrapper[4835]: I1002 11:43:33.958673 4835 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.049325 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d"] Oct 02 11:43:34 crc kubenswrapper[4835]: E1002 11:43:34.049689 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef15b26-b414-4846-b8e6-6846f04d18c5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.049706 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef15b26-b414-4846-b8e6-6846f04d18c5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.049865 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef15b26-b414-4846-b8e6-6846f04d18c5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.050434 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.056568 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.056763 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.057291 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hcnzr" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.057504 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.057538 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.057635 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.059542 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.059704 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.059850 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.061236 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d"] Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133450 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133520 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133550 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133676 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133717 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133778 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133816 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133899 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133943 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: 
\"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.133978 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.134015 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfd9w\" (UniqueName: \"kubernetes.io/projected/758c6988-399c-4303-a629-876f1234d88e-kube-api-access-xfd9w\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.235381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.235430 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.235454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.235476 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.235516 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.236003 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.236052 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.236085 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.236113 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.236143 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfd9w\" (UniqueName: \"kubernetes.io/projected/758c6988-399c-4303-a629-876f1234d88e-kube-api-access-xfd9w\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.236193 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.236670 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.237376 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.239636 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.239659 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.240091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.240662 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.240798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.241258 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.243904 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.243950 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc 
kubenswrapper[4835]: I1002 11:43:34.254379 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfd9w\" (UniqueName: \"kubernetes.io/projected/758c6988-399c-4303-a629-876f1234d88e-kube-api-access-xfd9w\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.378461 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.951675 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d"] Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.954434 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:43:34 crc kubenswrapper[4835]: I1002 11:43:34.967838 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" event={"ID":"758c6988-399c-4303-a629-876f1234d88e","Type":"ContainerStarted","Data":"12abd15663c77b0af21f700f941d050851396bd1286851a58723332ba8853e63"} Oct 02 11:43:35 crc kubenswrapper[4835]: I1002 11:43:35.982661 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" event={"ID":"758c6988-399c-4303-a629-876f1234d88e","Type":"ContainerStarted","Data":"360da1d520d403e1d8ca5acac65c6f0ca15b02660d5f3e6655b54c42e53e8add"} Oct 02 11:43:36 crc kubenswrapper[4835]: I1002 11:43:36.003134 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" podStartSLOduration=1.489919364 podStartE2EDuration="2.003056475s" podCreationTimestamp="2025-10-02 11:43:34 +0000 UTC" firstStartedPulling="2025-10-02 11:43:34.954150918 +0000 UTC m=+2891.514058509" lastFinishedPulling="2025-10-02 11:43:35.467288039 +0000 UTC m=+2892.027195620" observedRunningTime="2025-10-02 11:43:35.997333262 +0000 UTC m=+2892.557240843" watchObservedRunningTime="2025-10-02 11:43:36.003056475 +0000 UTC m=+2892.562964056" Oct 02 11:43:41 crc kubenswrapper[4835]: I1002 11:43:41.984297 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:43:41 crc kubenswrapper[4835]: I1002 11:43:41.984904 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:44:11 crc kubenswrapper[4835]: I1002 11:44:11.984323 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:44:11 crc kubenswrapper[4835]: I1002 11:44:11.984991 4835 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:44:11 crc kubenswrapper[4835]: I1002 11:44:11.985051 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:44:11 crc kubenswrapper[4835]: I1002 11:44:11.986043 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0244201f52c8dd5e3922b1708549b362fa6db05645f36e77deac378419afe8ad"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:44:11 crc kubenswrapper[4835]: I1002 11:44:11.986112 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://0244201f52c8dd5e3922b1708549b362fa6db05645f36e77deac378419afe8ad" gracePeriod=600 Oct 02 11:44:12 crc kubenswrapper[4835]: I1002 11:44:12.296174 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="0244201f52c8dd5e3922b1708549b362fa6db05645f36e77deac378419afe8ad" exitCode=0 Oct 02 11:44:12 crc kubenswrapper[4835]: I1002 11:44:12.296256 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"0244201f52c8dd5e3922b1708549b362fa6db05645f36e77deac378419afe8ad"} Oct 02 11:44:12 crc kubenswrapper[4835]: I1002 11:44:12.296681 4835 scope.go:117] "RemoveContainer" containerID="f4e753f7b8f020fe7dc319bb75d9fa9058a8813d06935ef737b89397ac6924a3" Oct 02 11:44:13 crc kubenswrapper[4835]: I1002 11:44:13.307371 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe"} Oct 02 11:44:31 crc kubenswrapper[4835]: I1002 11:44:31.997085 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zt4w5"] Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.000261 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.019439 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zt4w5"] Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.097370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scvlf\" (UniqueName: \"kubernetes.io/projected/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-kube-api-access-scvlf\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.097457 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-catalog-content\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.097607 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-utilities\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.199461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-catalog-content\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.199574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-utilities\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.199664 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scvlf\" (UniqueName: \"kubernetes.io/projected/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-kube-api-access-scvlf\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.200025 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-catalog-content\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.200047 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-utilities\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.225850 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-scvlf\" (UniqueName: \"kubernetes.io/projected/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-kube-api-access-scvlf\") pod \"redhat-operators-zt4w5\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.325940 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:32 crc kubenswrapper[4835]: I1002 11:44:32.767429 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zt4w5"] Oct 02 11:44:32 crc kubenswrapper[4835]: W1002 11:44:32.776396 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18e8e056_79bc_47c1_b6b9_5cb6ecd8e694.slice/crio-b6d4ea33b48aeb366cf0338bfb4298f2782376dffcdc6f315a2235ac13e31091 WatchSource:0}: Error finding container b6d4ea33b48aeb366cf0338bfb4298f2782376dffcdc6f315a2235ac13e31091: Status 404 returned error can't find the container with id b6d4ea33b48aeb366cf0338bfb4298f2782376dffcdc6f315a2235ac13e31091 Oct 02 11:44:33 crc kubenswrapper[4835]: I1002 11:44:33.466598 4835 generic.go:334] "Generic (PLEG): container finished" podID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerID="2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3" exitCode=0 Oct 02 11:44:33 crc kubenswrapper[4835]: I1002 11:44:33.466705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt4w5" event={"ID":"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694","Type":"ContainerDied","Data":"2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3"} Oct 02 11:44:33 crc kubenswrapper[4835]: I1002 11:44:33.466927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt4w5" event={"ID":"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694","Type":"ContainerStarted","Data":"b6d4ea33b48aeb366cf0338bfb4298f2782376dffcdc6f315a2235ac13e31091"} Oct 02 11:44:35 crc kubenswrapper[4835]: I1002 11:44:35.485115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt4w5" event={"ID":"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694","Type":"ContainerStarted","Data":"798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b"} Oct 02 11:44:36 crc kubenswrapper[4835]: I1002 11:44:36.495132 4835 generic.go:334] "Generic (PLEG): container finished" podID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerID="798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b" exitCode=0 Oct 02 11:44:36 crc kubenswrapper[4835]: I1002 11:44:36.495252 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt4w5" event={"ID":"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694","Type":"ContainerDied","Data":"798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b"} Oct 02 11:44:37 crc kubenswrapper[4835]: I1002 11:44:37.505852 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt4w5" event={"ID":"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694","Type":"ContainerStarted","Data":"6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8"} Oct 02 11:44:38 crc kubenswrapper[4835]: I1002 11:44:38.537128 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zt4w5" podStartSLOduration=3.754653999 podStartE2EDuration="7.537110954s" 
podCreationTimestamp="2025-10-02 11:44:31 +0000 UTC" firstStartedPulling="2025-10-02 11:44:33.468134675 +0000 UTC m=+2950.028042256" lastFinishedPulling="2025-10-02 11:44:37.25059164 +0000 UTC m=+2953.810499211" observedRunningTime="2025-10-02 11:44:38.534839659 +0000 UTC m=+2955.094747240" watchObservedRunningTime="2025-10-02 11:44:38.537110954 +0000 UTC m=+2955.097018535" Oct 02 11:44:42 crc kubenswrapper[4835]: I1002 11:44:42.327049 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:42 crc kubenswrapper[4835]: I1002 11:44:42.327415 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:42 crc kubenswrapper[4835]: I1002 11:44:42.374091 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:42 crc kubenswrapper[4835]: I1002 11:44:42.589142 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:42 crc kubenswrapper[4835]: I1002 11:44:42.668423 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zt4w5"] Oct 02 11:44:44 crc kubenswrapper[4835]: I1002 11:44:44.563903 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zt4w5" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="registry-server" containerID="cri-o://6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8" gracePeriod=2 Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.072054 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.149131 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-catalog-content\") pod \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.149328 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-utilities\") pod \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.149518 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scvlf\" (UniqueName: \"kubernetes.io/projected/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-kube-api-access-scvlf\") pod \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\" (UID: \"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694\") " Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.150181 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-utilities" (OuterVolumeSpecName: "utilities") pod "18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" (UID: "18e8e056-79bc-47c1-b6b9-5cb6ecd8e694"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.154988 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-kube-api-access-scvlf" (OuterVolumeSpecName: "kube-api-access-scvlf") pod "18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" (UID: "18e8e056-79bc-47c1-b6b9-5cb6ecd8e694"). InnerVolumeSpecName "kube-api-access-scvlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.252051 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.252247 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scvlf\" (UniqueName: \"kubernetes.io/projected/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-kube-api-access-scvlf\") on node \"crc\" DevicePath \"\"" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.260967 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" (UID: "18e8e056-79bc-47c1-b6b9-5cb6ecd8e694"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.354515 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.574314 4835 generic.go:334] "Generic (PLEG): container finished" podID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerID="6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8" exitCode=0 Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.574356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt4w5" event={"ID":"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694","Type":"ContainerDied","Data":"6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8"} Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.574404 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt4w5" event={"ID":"18e8e056-79bc-47c1-b6b9-5cb6ecd8e694","Type":"ContainerDied","Data":"b6d4ea33b48aeb366cf0338bfb4298f2782376dffcdc6f315a2235ac13e31091"} Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.574423 4835 scope.go:117] "RemoveContainer" containerID="6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.574417 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zt4w5" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.601631 4835 scope.go:117] "RemoveContainer" containerID="798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.624348 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zt4w5"] Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.632476 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zt4w5"] Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.635817 4835 scope.go:117] "RemoveContainer" containerID="2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.686731 4835 scope.go:117] "RemoveContainer" containerID="6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8" Oct 02 11:44:45 crc kubenswrapper[4835]: E1002 11:44:45.687332 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8\": container with ID starting with 6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8 not found: ID does not exist" containerID="6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.687401 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8"} err="failed to get container status \"6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8\": rpc error: code = NotFound desc = could not find container \"6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8\": container with ID starting with 6eac8c3cfdc610a9966861a89e7a03ddc60d6a6cd7c508f43ad19cfa12311ce8 not found: ID does not exist" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.687476 4835 scope.go:117] "RemoveContainer" containerID="798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b" Oct 02 11:44:45 crc kubenswrapper[4835]: E1002 11:44:45.688424 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b\": container with ID starting with 798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b not found: ID does not exist" containerID="798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.688454 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b"} err="failed to get container status \"798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b\": rpc error: code = NotFound desc = could not find container \"798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b\": container with ID starting with 798fa33911c2b9c172d9d4446e19886112642b39356438ed70f3554f1664093b not found: ID does not exist" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.688474 4835 scope.go:117] "RemoveContainer" containerID="2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3" Oct 02 11:44:45 crc kubenswrapper[4835]: E1002 11:44:45.688960 4835 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3\": container with ID starting with 2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3 not found: ID does not exist" containerID="2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3" Oct 02 11:44:45 crc kubenswrapper[4835]: I1002 11:44:45.689154 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3"} err="failed to get container status \"2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3\": rpc error: code = NotFound desc = could not find container \"2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3\": container with ID starting with 2393e2c15913194ca88dcf1fef02778fc43cfa3846a70f333962f857827b03c3 not found: ID does not exist" Oct 02 11:44:46 crc kubenswrapper[4835]: I1002 11:44:46.264360 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" path="/var/lib/kubelet/pods/18e8e056-79bc-47c1-b6b9-5cb6ecd8e694/volumes" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.868910 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w262j"] Oct 02 11:44:49 crc kubenswrapper[4835]: E1002 11:44:49.869621 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="extract-utilities" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.869638 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="extract-utilities" Oct 02 11:44:49 crc kubenswrapper[4835]: E1002 11:44:49.869660 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="extract-content" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.869667 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="extract-content" Oct 02 11:44:49 crc kubenswrapper[4835]: E1002 11:44:49.869688 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="registry-server" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.869716 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="registry-server" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.869912 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e8e056-79bc-47c1-b6b9-5cb6ecd8e694" containerName="registry-server" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.871286 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.887889 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w262j"] Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.938804 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-utilities\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.939178 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-catalog-content\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:49 crc kubenswrapper[4835]: I1002 11:44:49.939603 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gnwv\" (UniqueName: \"kubernetes.io/projected/21c44a49-de8a-4b00-a7d0-2026b634b58c-kube-api-access-6gnwv\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.040891 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-utilities\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.041008 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-catalog-content\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.041086 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gnwv\" (UniqueName: \"kubernetes.io/projected/21c44a49-de8a-4b00-a7d0-2026b634b58c-kube-api-access-6gnwv\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.041658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-catalog-content\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.041713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-utilities\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.061268 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6gnwv\" (UniqueName: \"kubernetes.io/projected/21c44a49-de8a-4b00-a7d0-2026b634b58c-kube-api-access-6gnwv\") pod \"certified-operators-w262j\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.192822 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:44:50 crc kubenswrapper[4835]: I1002 11:44:50.703132 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w262j"] Oct 02 11:44:51 crc kubenswrapper[4835]: I1002 11:44:51.624867 4835 generic.go:334] "Generic (PLEG): container finished" podID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerID="4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62" exitCode=0 Oct 02 11:44:51 crc kubenswrapper[4835]: I1002 11:44:51.624933 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w262j" event={"ID":"21c44a49-de8a-4b00-a7d0-2026b634b58c","Type":"ContainerDied","Data":"4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62"} Oct 02 11:44:51 crc kubenswrapper[4835]: I1002 11:44:51.626418 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w262j" event={"ID":"21c44a49-de8a-4b00-a7d0-2026b634b58c","Type":"ContainerStarted","Data":"cd9ec978b7024a83850e080619f19fc1716450c99c02b7d51700e8f9ba7cdeb1"} Oct 02 11:44:53 crc kubenswrapper[4835]: E1002 11:44:53.319435 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c44a49_de8a_4b00_a7d0_2026b634b58c.slice/crio-conmon-fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:44:53 crc kubenswrapper[4835]: I1002 11:44:53.646446 4835 generic.go:334] "Generic (PLEG): container finished" podID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerID="fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db" exitCode=0 Oct 02 11:44:53 crc kubenswrapper[4835]: I1002 11:44:53.646510 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w262j" event={"ID":"21c44a49-de8a-4b00-a7d0-2026b634b58c","Type":"ContainerDied","Data":"fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db"} Oct 02 11:44:54 crc kubenswrapper[4835]: I1002 11:44:54.657322 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w262j" event={"ID":"21c44a49-de8a-4b00-a7d0-2026b634b58c","Type":"ContainerStarted","Data":"c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07"} Oct 02 11:44:54 crc kubenswrapper[4835]: I1002 11:44:54.678005 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w262j" podStartSLOduration=3.137241485 podStartE2EDuration="5.677986812s" podCreationTimestamp="2025-10-02 11:44:49 +0000 UTC" firstStartedPulling="2025-10-02 11:44:51.629456545 +0000 UTC m=+2968.189364126" lastFinishedPulling="2025-10-02 11:44:54.170201872 +0000 UTC m=+2970.730109453" observedRunningTime="2025-10-02 11:44:54.673768682 +0000 UTC m=+2971.233676273" watchObservedRunningTime="2025-10-02 11:44:54.677986812 +0000 UTC m=+2971.237894393" Oct 02 11:45:00 crc 
kubenswrapper[4835]: I1002 11:45:00.185477 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4"] Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.188726 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.192424 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.192582 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.193608 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.193772 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.195106 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4"] Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.238842 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z5pl\" (UniqueName: \"kubernetes.io/projected/491d8503-b7e0-41af-9fce-7ea8f1344ff3-kube-api-access-8z5pl\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.238952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d8503-b7e0-41af-9fce-7ea8f1344ff3-secret-volume\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.238986 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d8503-b7e0-41af-9fce-7ea8f1344ff3-config-volume\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.249085 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.341422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z5pl\" (UniqueName: \"kubernetes.io/projected/491d8503-b7e0-41af-9fce-7ea8f1344ff3-kube-api-access-8z5pl\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.341898 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/491d8503-b7e0-41af-9fce-7ea8f1344ff3-secret-volume\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.341935 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d8503-b7e0-41af-9fce-7ea8f1344ff3-config-volume\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.343025 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d8503-b7e0-41af-9fce-7ea8f1344ff3-config-volume\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.358008 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d8503-b7e0-41af-9fce-7ea8f1344ff3-secret-volume\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.368182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z5pl\" (UniqueName: \"kubernetes.io/projected/491d8503-b7e0-41af-9fce-7ea8f1344ff3-kube-api-access-8z5pl\") pod \"collect-profiles-29323425-jdvr4\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.517445 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.764819 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:45:00 crc kubenswrapper[4835]: I1002 11:45:00.809685 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w262j"] Oct 02 11:45:01 crc kubenswrapper[4835]: I1002 11:45:01.004387 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4"] Oct 02 11:45:01 crc kubenswrapper[4835]: W1002 11:45:01.015585 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491d8503_b7e0_41af_9fce_7ea8f1344ff3.slice/crio-ba51a2539734a40a7729e8bf7f5ef96e316a3f091267ec2f788b41d6e3ba4b77 WatchSource:0}: Error finding container ba51a2539734a40a7729e8bf7f5ef96e316a3f091267ec2f788b41d6e3ba4b77: Status 404 returned error can't find the container with id ba51a2539734a40a7729e8bf7f5ef96e316a3f091267ec2f788b41d6e3ba4b77 Oct 02 11:45:01 crc kubenswrapper[4835]: I1002 11:45:01.726371 4835 generic.go:334] "Generic (PLEG): container finished" podID="491d8503-b7e0-41af-9fce-7ea8f1344ff3" containerID="e4009de0d456aef2eb5d606411b04231830086bb6a0060a19f11b663a8faa58a" exitCode=0 Oct 02 11:45:01 crc kubenswrapper[4835]: I1002 11:45:01.726422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" event={"ID":"491d8503-b7e0-41af-9fce-7ea8f1344ff3","Type":"ContainerDied","Data":"e4009de0d456aef2eb5d606411b04231830086bb6a0060a19f11b663a8faa58a"} Oct 02 11:45:01 crc kubenswrapper[4835]: I1002 11:45:01.727992 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" event={"ID":"491d8503-b7e0-41af-9fce-7ea8f1344ff3","Type":"ContainerStarted","Data":"ba51a2539734a40a7729e8bf7f5ef96e316a3f091267ec2f788b41d6e3ba4b77"} Oct 02 11:45:02 crc kubenswrapper[4835]: I1002 11:45:02.736340 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w262j" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="registry-server" containerID="cri-o://c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07" gracePeriod=2 Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.215116 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.222981 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.311285 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d8503-b7e0-41af-9fce-7ea8f1344ff3-config-volume\") pod \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.312050 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491d8503-b7e0-41af-9fce-7ea8f1344ff3-config-volume" (OuterVolumeSpecName: "config-volume") pod "491d8503-b7e0-41af-9fce-7ea8f1344ff3" (UID: "491d8503-b7e0-41af-9fce-7ea8f1344ff3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.312196 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gnwv\" (UniqueName: \"kubernetes.io/projected/21c44a49-de8a-4b00-a7d0-2026b634b58c-kube-api-access-6gnwv\") pod \"21c44a49-de8a-4b00-a7d0-2026b634b58c\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.312849 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-utilities\") pod \"21c44a49-de8a-4b00-a7d0-2026b634b58c\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.312896 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-catalog-content\") pod \"21c44a49-de8a-4b00-a7d0-2026b634b58c\" (UID: \"21c44a49-de8a-4b00-a7d0-2026b634b58c\") " Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.312942 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d8503-b7e0-41af-9fce-7ea8f1344ff3-secret-volume\") pod \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.312984 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z5pl\" (UniqueName: \"kubernetes.io/projected/491d8503-b7e0-41af-9fce-7ea8f1344ff3-kube-api-access-8z5pl\") pod \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\" (UID: \"491d8503-b7e0-41af-9fce-7ea8f1344ff3\") " Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.313623 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d8503-b7e0-41af-9fce-7ea8f1344ff3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.313643 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-utilities" (OuterVolumeSpecName: "utilities") pod "21c44a49-de8a-4b00-a7d0-2026b634b58c" (UID: "21c44a49-de8a-4b00-a7d0-2026b634b58c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.317993 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491d8503-b7e0-41af-9fce-7ea8f1344ff3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "491d8503-b7e0-41af-9fce-7ea8f1344ff3" (UID: "491d8503-b7e0-41af-9fce-7ea8f1344ff3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.319190 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c44a49-de8a-4b00-a7d0-2026b634b58c-kube-api-access-6gnwv" (OuterVolumeSpecName: "kube-api-access-6gnwv") pod "21c44a49-de8a-4b00-a7d0-2026b634b58c" (UID: "21c44a49-de8a-4b00-a7d0-2026b634b58c"). InnerVolumeSpecName "kube-api-access-6gnwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.327618 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491d8503-b7e0-41af-9fce-7ea8f1344ff3-kube-api-access-8z5pl" (OuterVolumeSpecName: "kube-api-access-8z5pl") pod "491d8503-b7e0-41af-9fce-7ea8f1344ff3" (UID: "491d8503-b7e0-41af-9fce-7ea8f1344ff3"). InnerVolumeSpecName "kube-api-access-8z5pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.376957 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21c44a49-de8a-4b00-a7d0-2026b634b58c" (UID: "21c44a49-de8a-4b00-a7d0-2026b634b58c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.414692 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d8503-b7e0-41af-9fce-7ea8f1344ff3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.414722 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z5pl\" (UniqueName: \"kubernetes.io/projected/491d8503-b7e0-41af-9fce-7ea8f1344ff3-kube-api-access-8z5pl\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.414731 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gnwv\" (UniqueName: \"kubernetes.io/projected/21c44a49-de8a-4b00-a7d0-2026b634b58c-kube-api-access-6gnwv\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.414739 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.414746 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c44a49-de8a-4b00-a7d0-2026b634b58c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.745111 4835 generic.go:334] "Generic (PLEG): container finished" podID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerID="c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07" exitCode=0 Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.745174 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w262j" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.745243 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w262j" event={"ID":"21c44a49-de8a-4b00-a7d0-2026b634b58c","Type":"ContainerDied","Data":"c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07"} Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.745667 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w262j" event={"ID":"21c44a49-de8a-4b00-a7d0-2026b634b58c","Type":"ContainerDied","Data":"cd9ec978b7024a83850e080619f19fc1716450c99c02b7d51700e8f9ba7cdeb1"} Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.745697 4835 scope.go:117] "RemoveContainer" containerID="c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.747064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" event={"ID":"491d8503-b7e0-41af-9fce-7ea8f1344ff3","Type":"ContainerDied","Data":"ba51a2539734a40a7729e8bf7f5ef96e316a3f091267ec2f788b41d6e3ba4b77"} Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.747091 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba51a2539734a40a7729e8bf7f5ef96e316a3f091267ec2f788b41d6e3ba4b77" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.747140 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.766313 4835 scope.go:117] "RemoveContainer" containerID="fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.784485 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w262j"] Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.788322 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w262j"] Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.803734 4835 scope.go:117] "RemoveContainer" containerID="4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.820398 4835 scope.go:117] "RemoveContainer" containerID="c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07" Oct 02 11:45:03 crc kubenswrapper[4835]: E1002 11:45:03.820790 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07\": container with ID starting with c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07 not found: ID does not exist" containerID="c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.820827 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07"} err="failed to get container status \"c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07\": rpc error: code = NotFound desc = could not find container \"c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07\": container with ID starting with c69df297d9ff4b62ac34c771629c8478661127c8dc6d54c59508877ad8fbee07 not found: ID does not exist" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.820852 4835 scope.go:117] "RemoveContainer" containerID="fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db" Oct 02 11:45:03 crc kubenswrapper[4835]: E1002 11:45:03.821127 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db\": container with ID starting with fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db not found: ID does not exist" containerID="fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.821164 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db"} err="failed to get container status \"fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db\": rpc error: code = NotFound desc = could not find container \"fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db\": container with ID starting with fb168f40316400dc6e415b2dbb18b8063997956c2256b7cf891a4c909ae618db not found: ID does not exist" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.821204 4835 scope.go:117] "RemoveContainer" containerID="4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62" Oct 02 11:45:03 crc kubenswrapper[4835]: E1002 11:45:03.821510 4835 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62\": container with ID starting with 4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62 not found: ID does not exist" containerID="4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62" Oct 02 11:45:03 crc kubenswrapper[4835]: I1002 11:45:03.821537 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62"} err="failed to get container status \"4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62\": rpc error: code = NotFound desc = could not find container \"4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62\": container with ID starting with 4f1029184f2802b982901efecb94360e891d14fcbe79ca9b77151c0c4c0eff62 not found: ID does not exist" Oct 02 11:45:04 crc kubenswrapper[4835]: I1002 11:45:04.275558 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" path="/var/lib/kubelet/pods/21c44a49-de8a-4b00-a7d0-2026b634b58c/volumes" Oct 02 11:45:04 crc kubenswrapper[4835]: I1002 11:45:04.289640 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k"] Oct 02 11:45:04 crc kubenswrapper[4835]: I1002 11:45:04.300337 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-87m6k"] Oct 02 11:45:06 crc kubenswrapper[4835]: I1002 11:45:06.262878 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c401d6-7b1d-40f4-8570-d0ccb5b778fe" path="/var/lib/kubelet/pods/b4c401d6-7b1d-40f4-8570-d0ccb5b778fe/volumes" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.621333 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tjtt2"] Oct 02 11:45:39 crc kubenswrapper[4835]: E1002 11:45:39.622132 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="extract-utilities" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.622143 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="extract-utilities" Oct 02 11:45:39 crc kubenswrapper[4835]: E1002 11:45:39.622152 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491d8503-b7e0-41af-9fce-7ea8f1344ff3" containerName="collect-profiles" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.622158 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="491d8503-b7e0-41af-9fce-7ea8f1344ff3" containerName="collect-profiles" Oct 02 11:45:39 crc kubenswrapper[4835]: E1002 11:45:39.622180 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="registry-server" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.622186 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="registry-server" Oct 02 11:45:39 crc kubenswrapper[4835]: E1002 11:45:39.622196 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="extract-content" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.622202 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="extract-content" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.622403 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="491d8503-b7e0-41af-9fce-7ea8f1344ff3" containerName="collect-profiles" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.622434 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c44a49-de8a-4b00-a7d0-2026b634b58c" containerName="registry-server" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.623686 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.644756 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjtt2"] Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.696958 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-utilities\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.697019 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-catalog-content\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.697161 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlc6t\" (UniqueName: \"kubernetes.io/projected/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-kube-api-access-tlc6t\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.799395 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-catalog-content\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.799872 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlc6t\" (UniqueName: \"kubernetes.io/projected/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-kube-api-access-tlc6t\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.800024 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-utilities\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.800161 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-catalog-content\") pod 
\"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.800501 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-utilities\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.831072 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlc6t\" (UniqueName: \"kubernetes.io/projected/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-kube-api-access-tlc6t\") pod \"community-operators-tjtt2\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:39 crc kubenswrapper[4835]: I1002 11:45:39.981018 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:40 crc kubenswrapper[4835]: I1002 11:45:40.507927 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjtt2"] Oct 02 11:45:41 crc kubenswrapper[4835]: I1002 11:45:41.091023 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerID="8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9" exitCode=0 Oct 02 11:45:41 crc kubenswrapper[4835]: I1002 11:45:41.091105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtt2" event={"ID":"e9ad0773-4df5-43d2-9a3e-80f08a620a1b","Type":"ContainerDied","Data":"8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9"} Oct 02 11:45:41 crc kubenswrapper[4835]: I1002 11:45:41.091485 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtt2" event={"ID":"e9ad0773-4df5-43d2-9a3e-80f08a620a1b","Type":"ContainerStarted","Data":"cb054ba989a5f3730c5dd7c5517fd3412abe820425205024881c952ec901da2e"} Oct 02 11:45:43 crc kubenswrapper[4835]: I1002 11:45:43.112321 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerID="9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509" exitCode=0 Oct 02 11:45:43 crc kubenswrapper[4835]: I1002 11:45:43.112433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtt2" event={"ID":"e9ad0773-4df5-43d2-9a3e-80f08a620a1b","Type":"ContainerDied","Data":"9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509"} Oct 02 11:45:44 crc kubenswrapper[4835]: I1002 11:45:44.122035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtt2" event={"ID":"e9ad0773-4df5-43d2-9a3e-80f08a620a1b","Type":"ContainerStarted","Data":"393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea"} Oct 02 11:45:44 crc kubenswrapper[4835]: I1002 11:45:44.149804 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tjtt2" podStartSLOduration=2.672271398 podStartE2EDuration="5.149781535s" podCreationTimestamp="2025-10-02 11:45:39 +0000 UTC" firstStartedPulling="2025-10-02 11:45:41.108452744 +0000 UTC m=+3017.668360335" lastFinishedPulling="2025-10-02 11:45:43.585962891 +0000 UTC m=+3020.145870472" 
observedRunningTime="2025-10-02 11:45:44.141701465 +0000 UTC m=+3020.701609076" watchObservedRunningTime="2025-10-02 11:45:44.149781535 +0000 UTC m=+3020.709689126" Oct 02 11:45:49 crc kubenswrapper[4835]: I1002 11:45:49.981194 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:49 crc kubenswrapper[4835]: I1002 11:45:49.981691 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:50 crc kubenswrapper[4835]: I1002 11:45:50.033887 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:50 crc kubenswrapper[4835]: I1002 11:45:50.227237 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:50 crc kubenswrapper[4835]: I1002 11:45:50.282592 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjtt2"] Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.187345 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tjtt2" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="registry-server" containerID="cri-o://393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea" gracePeriod=2 Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.673703 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.771874 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-utilities\") pod \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.772043 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-catalog-content\") pod \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.772069 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlc6t\" (UniqueName: \"kubernetes.io/projected/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-kube-api-access-tlc6t\") pod \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\" (UID: \"e9ad0773-4df5-43d2-9a3e-80f08a620a1b\") " Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.773770 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-utilities" (OuterVolumeSpecName: "utilities") pod "e9ad0773-4df5-43d2-9a3e-80f08a620a1b" (UID: "e9ad0773-4df5-43d2-9a3e-80f08a620a1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.777538 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-kube-api-access-tlc6t" (OuterVolumeSpecName: "kube-api-access-tlc6t") pod "e9ad0773-4df5-43d2-9a3e-80f08a620a1b" (UID: "e9ad0773-4df5-43d2-9a3e-80f08a620a1b"). 
InnerVolumeSpecName "kube-api-access-tlc6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.874329 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlc6t\" (UniqueName: \"kubernetes.io/projected/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-kube-api-access-tlc6t\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:52 crc kubenswrapper[4835]: I1002 11:45:52.874589 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.207438 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerID="393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea" exitCode=0 Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.207513 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtt2" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.207532 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtt2" event={"ID":"e9ad0773-4df5-43d2-9a3e-80f08a620a1b","Type":"ContainerDied","Data":"393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea"} Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.208000 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtt2" event={"ID":"e9ad0773-4df5-43d2-9a3e-80f08a620a1b","Type":"ContainerDied","Data":"cb054ba989a5f3730c5dd7c5517fd3412abe820425205024881c952ec901da2e"} Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.208021 4835 scope.go:117] "RemoveContainer" containerID="393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.244517 4835 scope.go:117] "RemoveContainer" containerID="9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.276053 4835 scope.go:117] "RemoveContainer" containerID="8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.325417 4835 scope.go:117] "RemoveContainer" containerID="393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea" Oct 02 11:45:53 crc kubenswrapper[4835]: E1002 11:45:53.325933 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea\": container with ID starting with 393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea not found: ID does not exist" containerID="393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.325985 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea"} err="failed to get container status \"393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea\": rpc error: code = NotFound desc = could not find container \"393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea\": container with ID starting with 393587abbc9f9e956bc2e1aac0dde800d5531c738fc418dc4a1fe8d5368c3bea not found: ID does not exist" Oct 02 11:45:53 crc kubenswrapper[4835]: 
I1002 11:45:53.326016 4835 scope.go:117] "RemoveContainer" containerID="9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509" Oct 02 11:45:53 crc kubenswrapper[4835]: E1002 11:45:53.326393 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509\": container with ID starting with 9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509 not found: ID does not exist" containerID="9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.326445 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509"} err="failed to get container status \"9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509\": rpc error: code = NotFound desc = could not find container \"9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509\": container with ID starting with 9d19c1ae7798f9217a701a394fac3d47f33e3d39f697b6b9c20957f56921b509 not found: ID does not exist" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.326480 4835 scope.go:117] "RemoveContainer" containerID="8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9" Oct 02 11:45:53 crc kubenswrapper[4835]: E1002 11:45:53.326880 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9\": container with ID starting with 8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9 not found: ID does not exist" containerID="8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.326915 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9"} err="failed to get container status \"8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9\": rpc error: code = NotFound desc = could not find container \"8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9\": container with ID starting with 8e069776d5c19c110e4c529b290251b9ee5b89f226ed9df8ff0f4a253ee807c9 not found: ID does not exist" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.392399 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9ad0773-4df5-43d2-9a3e-80f08a620a1b" (UID: "e9ad0773-4df5-43d2-9a3e-80f08a620a1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.490343 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ad0773-4df5-43d2-9a3e-80f08a620a1b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.570571 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjtt2"] Oct 02 11:45:53 crc kubenswrapper[4835]: I1002 11:45:53.579631 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tjtt2"] Oct 02 11:45:54 crc kubenswrapper[4835]: I1002 11:45:54.269358 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" path="/var/lib/kubelet/pods/e9ad0773-4df5-43d2-9a3e-80f08a620a1b/volumes" Oct 02 11:46:00 crc kubenswrapper[4835]: I1002 11:46:00.000438 4835 scope.go:117] "RemoveContainer" containerID="3c035763cfda7c9e4888012d3918bb05d86aa1ed21793a182fb3b06493a03a9a" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.131745 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5sg"] Oct 02 11:46:20 crc kubenswrapper[4835]: E1002 11:46:20.132754 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="registry-server" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.132770 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="registry-server" Oct 02 11:46:20 crc kubenswrapper[4835]: E1002 11:46:20.132811 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="extract-utilities" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.132844 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="extract-utilities" Oct 02 11:46:20 crc kubenswrapper[4835]: E1002 11:46:20.132858 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="extract-content" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.132868 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="extract-content" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.133127 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ad0773-4df5-43d2-9a3e-80f08a620a1b" containerName="registry-server" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.137673 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.147275 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5sg"] Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.230574 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-utilities\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.230679 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-catalog-content\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.230721 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhxmf\" (UniqueName: \"kubernetes.io/projected/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-kube-api-access-qhxmf\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.331945 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-catalog-content\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.332025 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhxmf\" (UniqueName: \"kubernetes.io/projected/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-kube-api-access-qhxmf\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.332129 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-utilities\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.332422 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-catalog-content\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.332510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-utilities\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.358377 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qhxmf\" (UniqueName: \"kubernetes.io/projected/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-kube-api-access-qhxmf\") pod \"redhat-marketplace-2v5sg\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.456453 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:20 crc kubenswrapper[4835]: I1002 11:46:20.886608 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5sg"] Oct 02 11:46:21 crc kubenswrapper[4835]: I1002 11:46:21.485427 4835 generic.go:334] "Generic (PLEG): container finished" podID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerID="a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8" exitCode=0 Oct 02 11:46:21 crc kubenswrapper[4835]: I1002 11:46:21.485503 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5sg" event={"ID":"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e","Type":"ContainerDied","Data":"a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8"} Oct 02 11:46:21 crc kubenswrapper[4835]: I1002 11:46:21.485604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5sg" event={"ID":"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e","Type":"ContainerStarted","Data":"29ff91f6e58f7bbae76c47b57c530bf1e807a156e1ef282001a1002fd1fdfc7b"} Oct 02 11:46:23 crc kubenswrapper[4835]: I1002 11:46:23.507175 4835 generic.go:334] "Generic (PLEG): container finished" podID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerID="3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83" exitCode=0 Oct 02 11:46:23 crc kubenswrapper[4835]: I1002 11:46:23.507320 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5sg" event={"ID":"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e","Type":"ContainerDied","Data":"3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83"} Oct 02 11:46:24 crc kubenswrapper[4835]: I1002 11:46:24.517464 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5sg" event={"ID":"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e","Type":"ContainerStarted","Data":"8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40"} Oct 02 11:46:24 crc kubenswrapper[4835]: I1002 11:46:24.542212 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2v5sg" podStartSLOduration=2.114990169 podStartE2EDuration="4.542193741s" podCreationTimestamp="2025-10-02 11:46:20 +0000 UTC" firstStartedPulling="2025-10-02 11:46:21.488802288 +0000 UTC m=+3058.048709869" lastFinishedPulling="2025-10-02 11:46:23.91600585 +0000 UTC m=+3060.475913441" observedRunningTime="2025-10-02 11:46:24.534873541 +0000 UTC m=+3061.094781132" watchObservedRunningTime="2025-10-02 11:46:24.542193741 +0000 UTC m=+3061.102101322" Oct 02 11:46:30 crc kubenswrapper[4835]: I1002 11:46:30.456780 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:30 crc kubenswrapper[4835]: I1002 11:46:30.457403 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:30 crc kubenswrapper[4835]: I1002 11:46:30.504075 4835 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:30 crc kubenswrapper[4835]: I1002 11:46:30.613466 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:30 crc kubenswrapper[4835]: I1002 11:46:30.737885 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5sg"] Oct 02 11:46:32 crc kubenswrapper[4835]: I1002 11:46:32.586314 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2v5sg" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="registry-server" containerID="cri-o://8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40" gracePeriod=2 Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.001923 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.109123 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-utilities\") pod \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.109595 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhxmf\" (UniqueName: \"kubernetes.io/projected/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-kube-api-access-qhxmf\") pod \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.109922 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-catalog-content\") pod \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\" (UID: \"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e\") " Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.110647 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-utilities" (OuterVolumeSpecName: "utilities") pod "2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" (UID: "2fc82dd3-652c-48ea-84d5-d054dd5a7c2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.121621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-kube-api-access-qhxmf" (OuterVolumeSpecName: "kube-api-access-qhxmf") pod "2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" (UID: "2fc82dd3-652c-48ea-84d5-d054dd5a7c2e"). InnerVolumeSpecName "kube-api-access-qhxmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.126391 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" (UID: "2fc82dd3-652c-48ea-84d5-d054dd5a7c2e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.212591 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.212645 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhxmf\" (UniqueName: \"kubernetes.io/projected/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-kube-api-access-qhxmf\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.212665 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.598043 4835 generic.go:334] "Generic (PLEG): container finished" podID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerID="8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40" exitCode=0 Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.598102 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5sg" event={"ID":"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e","Type":"ContainerDied","Data":"8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40"} Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.598136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5sg" event={"ID":"2fc82dd3-652c-48ea-84d5-d054dd5a7c2e","Type":"ContainerDied","Data":"29ff91f6e58f7bbae76c47b57c530bf1e807a156e1ef282001a1002fd1fdfc7b"} Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.598153 4835 scope.go:117] "RemoveContainer" containerID="8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.598132 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5sg" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.622095 4835 scope.go:117] "RemoveContainer" containerID="3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.640170 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5sg"] Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.648684 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5sg"] Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.664807 4835 scope.go:117] "RemoveContainer" containerID="a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.697832 4835 scope.go:117] "RemoveContainer" containerID="8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40" Oct 02 11:46:33 crc kubenswrapper[4835]: E1002 11:46:33.698817 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40\": container with ID starting with 8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40 not found: ID does not exist" containerID="8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.698970 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40"} err="failed to get container status \"8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40\": rpc error: code = NotFound desc = could not find container \"8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40\": container with ID starting with 8e1be11401cba980065ddc0325da6fa6ef797613e66cfcf68f00b99d6e856b40 not found: ID does not exist" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.699104 4835 scope.go:117] "RemoveContainer" containerID="3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83" Oct 02 11:46:33 crc kubenswrapper[4835]: E1002 11:46:33.702524 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83\": container with ID starting with 3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83 not found: ID does not exist" containerID="3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.702562 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83"} err="failed to get container status \"3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83\": rpc error: code = NotFound desc = could not find container \"3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83\": container with ID starting with 3c8b9f0449bc99add6eb78538ccc2eb1487a28092e00034a51fb319bf68c2b83 not found: ID does not exist" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.702587 4835 scope.go:117] "RemoveContainer" containerID="a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8" Oct 02 11:46:33 crc kubenswrapper[4835]: E1002 11:46:33.705506 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8\": container with ID starting with a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8 not found: ID does not exist" containerID="a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8" Oct 02 11:46:33 crc kubenswrapper[4835]: I1002 11:46:33.705562 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8"} err="failed to get container status \"a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8\": rpc error: code = NotFound desc = could not find container \"a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8\": container with ID starting with a9539ab4da30b56d87f8ec4a1d033d5fbd448e4ee2b0f83d0d878b6a94b40ef8 not found: ID does not exist" Oct 02 11:46:34 crc kubenswrapper[4835]: I1002 11:46:34.269085 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" path="/var/lib/kubelet/pods/2fc82dd3-652c-48ea-84d5-d054dd5a7c2e/volumes" Oct 02 11:46:41 crc kubenswrapper[4835]: I1002 11:46:41.983946 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:46:41 crc kubenswrapper[4835]: I1002 11:46:41.984690 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:47:02 crc kubenswrapper[4835]: I1002 11:47:02.861750 4835 generic.go:334] "Generic (PLEG): container finished" podID="758c6988-399c-4303-a629-876f1234d88e" containerID="360da1d520d403e1d8ca5acac65c6f0ca15b02660d5f3e6655b54c42e53e8add" exitCode=0 Oct 02 11:47:02 crc kubenswrapper[4835]: I1002 11:47:02.861876 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" event={"ID":"758c6988-399c-4303-a629-876f1234d88e","Type":"ContainerDied","Data":"360da1d520d403e1d8ca5acac65c6f0ca15b02660d5f3e6655b54c42e53e8add"} Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.272670 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304050 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-inventory\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304200 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-0\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304236 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-0\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304274 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-ceph-nova-0\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304304 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-1\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304345 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ceph\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-nova-extra-config-0\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304403 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ssh-key\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304492 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-custom-ceph-combined-ca-bundle\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304511 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfd9w\" (UniqueName: 
\"kubernetes.io/projected/758c6988-399c-4303-a629-876f1234d88e-kube-api-access-xfd9w\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.304542 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-1\") pod \"758c6988-399c-4303-a629-876f1234d88e\" (UID: \"758c6988-399c-4303-a629-876f1234d88e\") " Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.326531 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758c6988-399c-4303-a629-876f1234d88e-kube-api-access-xfd9w" (OuterVolumeSpecName: "kube-api-access-xfd9w") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "kube-api-access-xfd9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.330718 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.331346 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ceph" (OuterVolumeSpecName: "ceph") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.343673 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-inventory" (OuterVolumeSpecName: "inventory") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.344045 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.346318 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.347108 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.352003 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.370545 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.371754 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.378380 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "758c6988-399c-4303-a629-876f1234d88e" (UID: "758c6988-399c-4303-a629-876f1234d88e"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407047 4835 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407094 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfd9w\" (UniqueName: \"kubernetes.io/projected/758c6988-399c-4303-a629-876f1234d88e-kube-api-access-xfd9w\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407105 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407115 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407129 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407139 4835 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407152 4835 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407163 4835 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407173 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407182 4835 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/758c6988-399c-4303-a629-876f1234d88e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.407191 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/758c6988-399c-4303-a629-876f1234d88e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.895322 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" event={"ID":"758c6988-399c-4303-a629-876f1234d88e","Type":"ContainerDied","Data":"12abd15663c77b0af21f700f941d050851396bd1286851a58723332ba8853e63"} Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.895594 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d" Oct 02 11:47:04 crc kubenswrapper[4835]: I1002 11:47:04.895612 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12abd15663c77b0af21f700f941d050851396bd1286851a58723332ba8853e63" Oct 02 11:47:11 crc kubenswrapper[4835]: I1002 11:47:11.984045 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:47:11 crc kubenswrapper[4835]: I1002 11:47:11.985118 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.539636 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 11:47:19 crc kubenswrapper[4835]: E1002 11:47:19.540476 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="extract-content" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.540493 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="extract-content" Oct 02 11:47:19 crc kubenswrapper[4835]: E1002 11:47:19.540513 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="registry-server" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.540521 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="registry-server" Oct 02 11:47:19 crc kubenswrapper[4835]: E1002 11:47:19.540538 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="extract-utilities" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.540544 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="extract-utilities" Oct 02 11:47:19 crc kubenswrapper[4835]: E1002 11:47:19.540557 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758c6988-399c-4303-a629-876f1234d88e" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.540563 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="758c6988-399c-4303-a629-876f1234d88e" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.540760 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="758c6988-399c-4303-a629-876f1234d88e" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.540770 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc82dd3-652c-48ea-84d5-d054dd5a7c2e" containerName="registry-server" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.541708 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.544047 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.548671 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.552025 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601723 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601809 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601839 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvbv\" (UniqueName: \"kubernetes.io/projected/3713b30c-b4df-4bce-912c-f8161b5ab949-kube-api-access-zmvbv\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601896 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601932 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601969 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-run\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.601992 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602023 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602061 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602089 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602109 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602134 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3713b30c-b4df-4bce-912c-f8161b5ab949-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602155 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-sys\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602173 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-dev\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.602249 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.607388 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.609166 4835 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.611260 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.625086 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.704146 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.704568 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.704680 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3713b30c-b4df-4bce-912c-f8161b5ab949-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.704399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.704757 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5351da7e-bf35-4614-95f5-72fb10c1b920-ceph\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.704949 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-sys\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705031 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-dev\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705077 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-config-data\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705150 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-sys\") pod 
\"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705165 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-dev\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705208 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705261 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705303 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-sys\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705430 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705484 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705519 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-run\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705615 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xpwh\" (UniqueName: \"kubernetes.io/projected/5351da7e-bf35-4614-95f5-72fb10c1b920-kube-api-access-8xpwh\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705629 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705643 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705694 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.705712 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706056 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvbv\" (UniqueName: \"kubernetes.io/projected/3713b30c-b4df-4bce-912c-f8161b5ab949-kube-api-access-zmvbv\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706077 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706097 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-dev\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706118 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-scripts\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706132 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-lib-modules\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706157 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706284 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706310 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706374 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-run\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706470 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706499 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-run\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc 
kubenswrapper[4835]: I1002 11:47:19.706577 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706659 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706671 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706706 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.706863 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3713b30c-b4df-4bce-912c-f8161b5ab949-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.709661 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.710114 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.710391 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3713b30c-b4df-4bce-912c-f8161b5ab949-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.711775 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.718146 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3713b30c-b4df-4bce-912c-f8161b5ab949-combined-ca-bundle\") pod 
\"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.735169 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvbv\" (UniqueName: \"kubernetes.io/projected/3713b30c-b4df-4bce-912c-f8161b5ab949-kube-api-access-zmvbv\") pod \"cinder-volume-volume1-0\" (UID: \"3713b30c-b4df-4bce-912c-f8161b5ab949\") " pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808500 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808570 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5351da7e-bf35-4614-95f5-72fb10c1b920-ceph\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808656 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808671 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-config-data\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808704 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-sys\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808746 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808778 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808826 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808856 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-run\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808916 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xpwh\" (UniqueName: \"kubernetes.io/projected/5351da7e-bf35-4614-95f5-72fb10c1b920-kube-api-access-8xpwh\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808949 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808983 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-dev\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.808999 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-scripts\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809017 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-lib-modules\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809053 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809133 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-run\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809203 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-sys\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809279 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809335 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809369 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809415 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.809459 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-dev\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.810388 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5351da7e-bf35-4614-95f5-72fb10c1b920-lib-modules\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.812683 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-config-data\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.813790 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.814160 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-scripts\") pod \"cinder-backup-0\" 
(UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.814804 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5351da7e-bf35-4614-95f5-72fb10c1b920-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.820096 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5351da7e-bf35-4614-95f5-72fb10c1b920-ceph\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.830670 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xpwh\" (UniqueName: \"kubernetes.io/projected/5351da7e-bf35-4614-95f5-72fb10c1b920-kube-api-access-8xpwh\") pod \"cinder-backup-0\" (UID: \"5351da7e-bf35-4614-95f5-72fb10c1b920\") " pod="openstack/cinder-backup-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.866156 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:19 crc kubenswrapper[4835]: I1002 11:47:19.924796 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.189871 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-s9l7s"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.193163 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-s9l7s" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.205647 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-s9l7s"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.267673 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9bf6dfccc-kgdvz"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.269162 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.272030 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-z69vw" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.273268 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.273446 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.273548 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.273987 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9bf6dfccc-kgdvz"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.349940 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.350507 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-logs\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.350566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-scripts\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.350622 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-horizon-secret-key\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.350647 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlbj6\" (UniqueName: \"kubernetes.io/projected/bb924623-b065-4a9c-b3cb-635e87d989d8-kube-api-access-zlbj6\") pod \"manila-db-create-s9l7s\" (UID: \"bb924623-b065-4a9c-b3cb-635e87d989d8\") " pod="openstack/manila-db-create-s9l7s" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.350687 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-config-data\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.350721 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l55qv\" (UniqueName: \"kubernetes.io/projected/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-kube-api-access-l55qv\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.351505 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.353850 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.354302 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.354547 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.354562 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rzbh7" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.366756 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c5fc895cc-7hsqv"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.368723 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.382140 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.416854 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c5fc895cc-7hsqv"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452121 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-config-data\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452177 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-ceph\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452202 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l55qv\" (UniqueName: \"kubernetes.io/projected/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-kube-api-access-l55qv\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452219 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnc5n\" (UniqueName: \"kubernetes.io/projected/52db0e3d-e5a8-4f34-ba3d-283451e3c843-kube-api-access-wnc5n\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452253 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v4jd\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-kube-api-access-4v4jd\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452276 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-scripts\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452325 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452355 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-logs\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452380 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452407 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52db0e3d-e5a8-4f34-ba3d-283451e3c843-logs\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452424 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-scripts\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452458 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452478 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-config-data\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452495 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-logs\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452517 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-scripts\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452545 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52db0e3d-e5a8-4f34-ba3d-283451e3c843-horizon-secret-key\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452579 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-horizon-secret-key\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452606 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-config-data\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452623 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlbj6\" (UniqueName: \"kubernetes.io/projected/bb924623-b065-4a9c-b3cb-635e87d989d8-kube-api-access-zlbj6\") pod \"manila-db-create-s9l7s\" (UID: \"bb924623-b065-4a9c-b3cb-635e87d989d8\") " pod="openstack/manila-db-create-s9l7s" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.452969 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-logs\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.453461 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-config-data\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.453894 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-scripts\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.464435 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.465045 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-horizon-secret-key\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.466200 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.469465 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.469770 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.481595 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlbj6\" (UniqueName: \"kubernetes.io/projected/bb924623-b065-4a9c-b3cb-635e87d989d8-kube-api-access-zlbj6\") pod \"manila-db-create-s9l7s\" (UID: \"bb924623-b065-4a9c-b3cb-635e87d989d8\") " pod="openstack/manila-db-create-s9l7s" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.485796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l55qv\" (UniqueName: \"kubernetes.io/projected/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-kube-api-access-l55qv\") pod \"horizon-9bf6dfccc-kgdvz\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.494498 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.520061 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-s9l7s" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-ceph\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566072 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnc5n\" (UniqueName: \"kubernetes.io/projected/52db0e3d-e5a8-4f34-ba3d-283451e3c843-kube-api-access-wnc5n\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566099 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v4jd\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-kube-api-access-4v4jd\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566125 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566154 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-scripts\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566300 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566338 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566393 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566448 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " 
pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566468 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtx9\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-kube-api-access-zgtx9\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566487 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.566526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52db0e3d-e5a8-4f34-ba3d-283451e3c843-logs\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.567203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.567246 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.567509 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52db0e3d-e5a8-4f34-ba3d-283451e3c843-logs\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.568092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-config-data\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.569849 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-logs\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.569879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-logs\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 
11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.569913 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-scripts\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.569943 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.570009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.570032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-scripts\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.570035 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52db0e3d-e5a8-4f34-ba3d-283451e3c843-horizon-secret-key\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.570083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-config-data\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.570102 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.570127 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.570908 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-scripts\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.571156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-logs\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.572482 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.573806 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-ceph\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.575208 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.581113 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52db0e3d-e5a8-4f34-ba3d-283451e3c843-horizon-secret-key\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.581352 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-config-data\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.581650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.582822 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-config-data\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.587767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v4jd\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-kube-api-access-4v4jd\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.588759 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnc5n\" (UniqueName: \"kubernetes.io/projected/52db0e3d-e5a8-4f34-ba3d-283451e3c843-kube-api-access-wnc5n\") pod \"horizon-6c5fc895cc-7hsqv\" (UID: 
\"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.597662 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.603931 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.607277 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672265 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672327 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672359 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672387 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtx9\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-kube-api-access-zgtx9\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672403 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-logs\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672485 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672515 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672533 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.672861 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.676678 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-logs\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.676982 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.681958 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.682556 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.683617 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.688449 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.691666 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.693601 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.698566 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtx9\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-kube-api-access-zgtx9\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.711327 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.724681 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.740907 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.785749 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:20 crc kubenswrapper[4835]: I1002 11:47:20.841591 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-s9l7s"] Oct 02 11:47:20 crc kubenswrapper[4835]: W1002 11:47:20.983380 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb924623_b065_4a9c_b3cb_635e87d989d8.slice/crio-2b88382d673ff6548aa19b7bbdaa3dbe791d01ca5990a3294a231fdcf511a291 WatchSource:0}: Error finding container 2b88382d673ff6548aa19b7bbdaa3dbe791d01ca5990a3294a231fdcf511a291: Status 404 returned error can't find the container with id 2b88382d673ff6548aa19b7bbdaa3dbe791d01ca5990a3294a231fdcf511a291 Oct 02 11:47:21 crc kubenswrapper[4835]: I1002 11:47:21.106871 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"3713b30c-b4df-4bce-912c-f8161b5ab949","Type":"ContainerStarted","Data":"8f254e27b9986dbb9df4d4e0556f9aec0179554e6d75e2a56bd0adf8e28fa904"} Oct 02 11:47:21 crc kubenswrapper[4835]: I1002 11:47:21.111938 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-s9l7s" event={"ID":"bb924623-b065-4a9c-b3cb-635e87d989d8","Type":"ContainerStarted","Data":"2b88382d673ff6548aa19b7bbdaa3dbe791d01ca5990a3294a231fdcf511a291"} Oct 02 11:47:21 crc kubenswrapper[4835]: I1002 11:47:21.124651 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5351da7e-bf35-4614-95f5-72fb10c1b920","Type":"ContainerStarted","Data":"c7cfeb23b1c80dff8edd3ced5a3550ac1a003d50ef72a986abe657d0f90cc905"} Oct 02 11:47:21 crc kubenswrapper[4835]: I1002 11:47:21.240300 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9bf6dfccc-kgdvz"] Oct 02 11:47:21 crc kubenswrapper[4835]: I1002 11:47:21.416905 
4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c5fc895cc-7hsqv"] Oct 02 11:47:21 crc kubenswrapper[4835]: I1002 11:47:21.476881 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:21 crc kubenswrapper[4835]: W1002 11:47:21.481833 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bd8b146_5a2c_436e_add6_5adc721b7df9.slice/crio-9963882b973c2be036baa9b29751602eb38e6ebcd01897c192f7a48d28811b37 WatchSource:0}: Error finding container 9963882b973c2be036baa9b29751602eb38e6ebcd01897c192f7a48d28811b37: Status 404 returned error can't find the container with id 9963882b973c2be036baa9b29751602eb38e6ebcd01897c192f7a48d28811b37 Oct 02 11:47:21 crc kubenswrapper[4835]: I1002 11:47:21.655938 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.134981 4835 generic.go:334] "Generic (PLEG): container finished" podID="bb924623-b065-4a9c-b3cb-635e87d989d8" containerID="e34305629e2eca88dc690f56867920716f3a340dcd93ca12703c32674aea3656" exitCode=0 Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.135529 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-s9l7s" event={"ID":"bb924623-b065-4a9c-b3cb-635e87d989d8","Type":"ContainerDied","Data":"e34305629e2eca88dc690f56867920716f3a340dcd93ca12703c32674aea3656"} Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.136332 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af640d37-f725-423e-b60d-8c160d6574e1","Type":"ContainerStarted","Data":"536c133e683ed5e37f7598bf790cbfec96f261eaf38e1734cf92357075e36357"} Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.139527 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fc895cc-7hsqv" event={"ID":"52db0e3d-e5a8-4f34-ba3d-283451e3c843","Type":"ContainerStarted","Data":"fd5cd0c743c491d416cd02aa194d947d3a61ddf909d209ec6ab4804b4cfca08b"} Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.141436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3bd8b146-5a2c-436e-add6-5adc721b7df9","Type":"ContainerStarted","Data":"9963882b973c2be036baa9b29751602eb38e6ebcd01897c192f7a48d28811b37"} Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.143103 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bf6dfccc-kgdvz" event={"ID":"c9fc8d63-9b88-4a08-bada-236a3c3d1bda","Type":"ContainerStarted","Data":"cdf1d981236eea9b2fb3abc531e8dd682c4e549a8e8541a3806c860103437a08"} Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.571508 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c5fc895cc-7hsqv"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.598440 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74c74d79b4-rfzlk"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.599930 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.606848 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.616652 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.645395 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74c74d79b4-rfzlk"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.689584 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9bf6dfccc-kgdvz"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.699347 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.759346 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-tls-certs\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.759422 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37beff21-75a0-4297-a84b-9a34ccb1d2e0-logs\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.759466 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxkx\" (UniqueName: \"kubernetes.io/projected/37beff21-75a0-4297-a84b-9a34ccb1d2e0-kube-api-access-zrxkx\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.759500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-combined-ca-bundle\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.759529 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-scripts\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.759563 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-secret-key\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.759580 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-config-data\") pod 
\"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.819301 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d655558cb-84687"] Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.909695 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-tls-certs\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.909846 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37beff21-75a0-4297-a84b-9a34ccb1d2e0-logs\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.909937 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxkx\" (UniqueName: \"kubernetes.io/projected/37beff21-75a0-4297-a84b-9a34ccb1d2e0-kube-api-access-zrxkx\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.910011 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-combined-ca-bundle\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.910049 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-scripts\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.910112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-secret-key\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.910135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-config-data\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.911471 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-config-data\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.918338 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.934718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37beff21-75a0-4297-a84b-9a34ccb1d2e0-logs\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.959797 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-scripts\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:22 crc kubenswrapper[4835]: I1002 11:47:22.965355 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-tls-certs\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.011999 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-combined-ca-bundle\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.014096 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-horizon-secret-key\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.014147 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-combined-ca-bundle\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.014174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fd0f229-d269-4fa9-bd48-0909ce1ce941-config-data\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.014255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48t5\" (UniqueName: \"kubernetes.io/projected/4fd0f229-d269-4fa9-bd48-0909ce1ce941-kube-api-access-q48t5\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.014285 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd0f229-d269-4fa9-bd48-0909ce1ce941-logs\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc 
kubenswrapper[4835]: I1002 11:47:23.014318 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-horizon-tls-certs\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.014363 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd0f229-d269-4fa9-bd48-0909ce1ce941-scripts\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.014773 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-secret-key\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.024077 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d655558cb-84687"] Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.044559 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxkx\" (UniqueName: \"kubernetes.io/projected/37beff21-75a0-4297-a84b-9a34ccb1d2e0-kube-api-access-zrxkx\") pod \"horizon-74c74d79b4-rfzlk\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.117310 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-horizon-secret-key\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.117361 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-combined-ca-bundle\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.117385 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fd0f229-d269-4fa9-bd48-0909ce1ce941-config-data\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.117439 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q48t5\" (UniqueName: \"kubernetes.io/projected/4fd0f229-d269-4fa9-bd48-0909ce1ce941-kube-api-access-q48t5\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.117461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd0f229-d269-4fa9-bd48-0909ce1ce941-logs\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") 
" pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.117489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-horizon-tls-certs\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.117536 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd0f229-d269-4fa9-bd48-0909ce1ce941-scripts\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.118386 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4fd0f229-d269-4fa9-bd48-0909ce1ce941-scripts\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.119768 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fd0f229-d269-4fa9-bd48-0909ce1ce941-config-data\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.120627 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fd0f229-d269-4fa9-bd48-0909ce1ce941-logs\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.123153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-horizon-tls-certs\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.123911 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-combined-ca-bundle\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.126602 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4fd0f229-d269-4fa9-bd48-0909ce1ce941-horizon-secret-key\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.139528 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48t5\" (UniqueName: \"kubernetes.io/projected/4fd0f229-d269-4fa9-bd48-0909ce1ce941-kube-api-access-q48t5\") pod \"horizon-6d655558cb-84687\" (UID: \"4fd0f229-d269-4fa9-bd48-0909ce1ce941\") " pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.170512 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"3713b30c-b4df-4bce-912c-f8161b5ab949","Type":"ContainerStarted","Data":"ecbadac8ad60da6aafd26391375a395879218685e5cec114f9b309cb9cebb828"} Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.170562 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"3713b30c-b4df-4bce-912c-f8161b5ab949","Type":"ContainerStarted","Data":"697fef5423412750ae2a02ecb12cfd58e12964010187d3dd8331083c93b9505b"} Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.174885 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af640d37-f725-423e-b60d-8c160d6574e1","Type":"ContainerStarted","Data":"c93f3b0bb55a1100d06a4fec054595d85f98624a1994266391a552fe9b9c65a3"} Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.180378 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5351da7e-bf35-4614-95f5-72fb10c1b920","Type":"ContainerStarted","Data":"1747d5278f635d63ba22a20094d60023b53351c69e9a5dc2c28c012a91208bc7"} Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.180406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5351da7e-bf35-4614-95f5-72fb10c1b920","Type":"ContainerStarted","Data":"2c20c79398856cfa0f7473d0c4a448e1c7052d2be101105de2ce39929a4b292c"} Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.190077 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3bd8b146-5a2c-436e-add6-5adc721b7df9","Type":"ContainerStarted","Data":"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91"} Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.193410 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.859051981 podStartE2EDuration="4.193389871s" podCreationTimestamp="2025-10-02 11:47:19 +0000 UTC" firstStartedPulling="2025-10-02 11:47:20.589827213 +0000 UTC m=+3117.149734794" lastFinishedPulling="2025-10-02 11:47:21.924165103 +0000 UTC m=+3118.484072684" observedRunningTime="2025-10-02 11:47:23.191663711 +0000 UTC m=+3119.751571302" watchObservedRunningTime="2025-10-02 11:47:23.193389871 +0000 UTC m=+3119.753297472" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.230898 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.138640166 podStartE2EDuration="4.230854159s" podCreationTimestamp="2025-10-02 11:47:19 +0000 UTC" firstStartedPulling="2025-10-02 11:47:20.826995668 +0000 UTC m=+3117.386903249" lastFinishedPulling="2025-10-02 11:47:21.919209661 +0000 UTC m=+3118.479117242" observedRunningTime="2025-10-02 11:47:23.211150092 +0000 UTC m=+3119.771057673" watchObservedRunningTime="2025-10-02 11:47:23.230854159 +0000 UTC m=+3119.790761740" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.237926 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.414322 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.635273 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-s9l7s" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.688272 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74c74d79b4-rfzlk"] Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.749369 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlbj6\" (UniqueName: \"kubernetes.io/projected/bb924623-b065-4a9c-b3cb-635e87d989d8-kube-api-access-zlbj6\") pod \"bb924623-b065-4a9c-b3cb-635e87d989d8\" (UID: \"bb924623-b065-4a9c-b3cb-635e87d989d8\") " Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.758400 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb924623-b065-4a9c-b3cb-635e87d989d8-kube-api-access-zlbj6" (OuterVolumeSpecName: "kube-api-access-zlbj6") pod "bb924623-b065-4a9c-b3cb-635e87d989d8" (UID: "bb924623-b065-4a9c-b3cb-635e87d989d8"). InnerVolumeSpecName "kube-api-access-zlbj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:23 crc kubenswrapper[4835]: I1002 11:47:23.851867 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlbj6\" (UniqueName: \"kubernetes.io/projected/bb924623-b065-4a9c-b3cb-635e87d989d8-kube-api-access-zlbj6\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.022785 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d655558cb-84687"] Oct 02 11:47:24 crc kubenswrapper[4835]: W1002 11:47:24.034583 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd0f229_d269_4fa9_bd48_0909ce1ce941.slice/crio-4522f41215b6d67f36ab5d35b46f86d3997e033b9a4c3eee6be6e11a5c7067bc WatchSource:0}: Error finding container 4522f41215b6d67f36ab5d35b46f86d3997e033b9a4c3eee6be6e11a5c7067bc: Status 404 returned error can't find the container with id 4522f41215b6d67f36ab5d35b46f86d3997e033b9a4c3eee6be6e11a5c7067bc Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.207782 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3bd8b146-5a2c-436e-add6-5adc721b7df9","Type":"ContainerStarted","Data":"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4"} Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.208077 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-log" containerID="cri-o://f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91" gracePeriod=30 Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.208586 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-httpd" containerID="cri-o://b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4" gracePeriod=30 Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.209771 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74c74d79b4-rfzlk" event={"ID":"37beff21-75a0-4297-a84b-9a34ccb1d2e0","Type":"ContainerStarted","Data":"82e0441e39723c666697d56aa0ce7f5be7c4d4b9a8181f4de9d928ea063493af"} Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.212016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d655558cb-84687" 
event={"ID":"4fd0f229-d269-4fa9-bd48-0909ce1ce941","Type":"ContainerStarted","Data":"4522f41215b6d67f36ab5d35b46f86d3997e033b9a4c3eee6be6e11a5c7067bc"} Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.221617 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-s9l7s" Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.221607 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-s9l7s" event={"ID":"bb924623-b065-4a9c-b3cb-635e87d989d8","Type":"ContainerDied","Data":"2b88382d673ff6548aa19b7bbdaa3dbe791d01ca5990a3294a231fdcf511a291"} Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.222341 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b88382d673ff6548aa19b7bbdaa3dbe791d01ca5990a3294a231fdcf511a291" Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.239934 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.239915889 podStartE2EDuration="4.239915889s" podCreationTimestamp="2025-10-02 11:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:47:24.234154893 +0000 UTC m=+3120.794062494" watchObservedRunningTime="2025-10-02 11:47:24.239915889 +0000 UTC m=+3120.799823470" Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.274127 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-log" containerID="cri-o://c93f3b0bb55a1100d06a4fec054595d85f98624a1994266391a552fe9b9c65a3" gracePeriod=30 Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.274289 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-httpd" containerID="cri-o://403e898e566760a2a301ca479a67e4b6dc7d6b17b91a62d2f4972970dcce2ee8" gracePeriod=30 Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.275201 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af640d37-f725-423e-b60d-8c160d6574e1","Type":"ContainerStarted","Data":"403e898e566760a2a301ca479a67e4b6dc7d6b17b91a62d2f4972970dcce2ee8"} Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.432127 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.432073648 podStartE2EDuration="4.432073648s" podCreationTimestamp="2025-10-02 11:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:47:24.419785955 +0000 UTC m=+3120.979693556" watchObservedRunningTime="2025-10-02 11:47:24.432073648 +0000 UTC m=+3120.991981249" Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.866783 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.920953 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:47:24 crc kubenswrapper[4835]: I1002 11:47:24.924925 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.083641 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-logs\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.083758 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v4jd\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-kube-api-access-4v4jd\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.083889 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-httpd-run\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.083919 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-combined-ca-bundle\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.083937 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-config-data\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.083962 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-ceph\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.084059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-public-tls-certs\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.084114 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.084163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-scripts\") pod \"3bd8b146-5a2c-436e-add6-5adc721b7df9\" (UID: \"3bd8b146-5a2c-436e-add6-5adc721b7df9\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.092530 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: 
"glance") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.094490 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.094745 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-logs" (OuterVolumeSpecName: "logs") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.112857 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-kube-api-access-4v4jd" (OuterVolumeSpecName: "kube-api-access-4v4jd") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "kube-api-access-4v4jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.112997 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-ceph" (OuterVolumeSpecName: "ceph") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.113325 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-scripts" (OuterVolumeSpecName: "scripts") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.115743 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.149123 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.156165 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-config-data" (OuterVolumeSpecName: "config-data") pod "3bd8b146-5a2c-436e-add6-5adc721b7df9" (UID: "3bd8b146-5a2c-436e-add6-5adc721b7df9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186721 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186760 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186774 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186786 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186796 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186837 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186848 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd8b146-5a2c-436e-add6-5adc721b7df9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186857 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd8b146-5a2c-436e-add6-5adc721b7df9-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.186870 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v4jd\" (UniqueName: \"kubernetes.io/projected/3bd8b146-5a2c-436e-add6-5adc721b7df9-kube-api-access-4v4jd\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.222655 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.273191 4835 generic.go:334] "Generic (PLEG): container finished" podID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerID="b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4" exitCode=0 Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.273436 4835 generic.go:334] "Generic (PLEG): container finished" podID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerID="f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91" 
exitCode=143 Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.273512 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3bd8b146-5a2c-436e-add6-5adc721b7df9","Type":"ContainerDied","Data":"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4"} Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.273614 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3bd8b146-5a2c-436e-add6-5adc721b7df9","Type":"ContainerDied","Data":"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91"} Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.273634 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3bd8b146-5a2c-436e-add6-5adc721b7df9","Type":"ContainerDied","Data":"9963882b973c2be036baa9b29751602eb38e6ebcd01897c192f7a48d28811b37"} Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.273654 4835 scope.go:117] "RemoveContainer" containerID="b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.273880 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.284592 4835 generic.go:334] "Generic (PLEG): container finished" podID="af640d37-f725-423e-b60d-8c160d6574e1" containerID="403e898e566760a2a301ca479a67e4b6dc7d6b17b91a62d2f4972970dcce2ee8" exitCode=0 Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.284632 4835 generic.go:334] "Generic (PLEG): container finished" podID="af640d37-f725-423e-b60d-8c160d6574e1" containerID="c93f3b0bb55a1100d06a4fec054595d85f98624a1994266391a552fe9b9c65a3" exitCode=143 Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.285347 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af640d37-f725-423e-b60d-8c160d6574e1","Type":"ContainerDied","Data":"403e898e566760a2a301ca479a67e4b6dc7d6b17b91a62d2f4972970dcce2ee8"} Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.285486 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af640d37-f725-423e-b60d-8c160d6574e1","Type":"ContainerDied","Data":"c93f3b0bb55a1100d06a4fec054595d85f98624a1994266391a552fe9b9c65a3"} Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.293260 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.335167 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.358147 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.368078 4835 scope.go:117] "RemoveContainer" containerID="f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.387270 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:25 crc kubenswrapper[4835]: E1002 11:47:25.387771 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb924623-b065-4a9c-b3cb-635e87d989d8" 
containerName="mariadb-database-create" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.387796 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb924623-b065-4a9c-b3cb-635e87d989d8" containerName="mariadb-database-create" Oct 02 11:47:25 crc kubenswrapper[4835]: E1002 11:47:25.387830 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-log" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.387840 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-log" Oct 02 11:47:25 crc kubenswrapper[4835]: E1002 11:47:25.387876 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-httpd" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.387886 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-httpd" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.388123 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb924623-b065-4a9c-b3cb-635e87d989d8" containerName="mariadb-database-create" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.388162 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-log" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.388176 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" containerName="glance-httpd" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.389596 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.395887 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.396265 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.399717 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.437412 4835 scope.go:117] "RemoveContainer" containerID="b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4" Oct 02 11:47:25 crc kubenswrapper[4835]: E1002 11:47:25.437946 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4\": container with ID starting with b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4 not found: ID does not exist" containerID="b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.437987 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4"} err="failed to get container status \"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4\": rpc error: code = NotFound desc = could not find container \"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4\": container with ID starting with b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4 not found: ID 
does not exist" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.438030 4835 scope.go:117] "RemoveContainer" containerID="f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91" Oct 02 11:47:25 crc kubenswrapper[4835]: E1002 11:47:25.438966 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91\": container with ID starting with f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91 not found: ID does not exist" containerID="f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.438997 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91"} err="failed to get container status \"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91\": rpc error: code = NotFound desc = could not find container \"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91\": container with ID starting with f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91 not found: ID does not exist" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.439017 4835 scope.go:117] "RemoveContainer" containerID="b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.439328 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4"} err="failed to get container status \"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4\": rpc error: code = NotFound desc = could not find container \"b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4\": container with ID starting with b06cba1e043a3c20edd1678da31c656e2f90de1940b252cb5b268c6cec1188b4 not found: ID does not exist" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.439352 4835 scope.go:117] "RemoveContainer" containerID="f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.439561 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91"} err="failed to get container status \"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91\": rpc error: code = NotFound desc = could not find container \"f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91\": container with ID starting with f9ebda8c376bbd86291765f88c13ceeced979ff69ead4051f43728b5b5a11c91 not found: ID does not exist" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.498807 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.498930 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chnq\" (UniqueName: \"kubernetes.io/projected/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-kube-api-access-8chnq\") pod \"glance-default-external-api-0\" (UID: 
\"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.498977 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-ceph\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.499007 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-logs\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.499033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.500291 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.500321 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.500590 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.500638 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.561047 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602283 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602321 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602404 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602430 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602492 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8chnq\" (UniqueName: \"kubernetes.io/projected/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-kube-api-access-8chnq\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602515 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-ceph\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602535 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-logs\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.602552 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: 
I1002 11:47:25.604462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-logs\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.604777 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.604948 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.607314 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.608457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.610797 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-ceph\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.612876 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.613137 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.628406 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chnq\" (UniqueName: \"kubernetes.io/projected/a9c14cdd-ae95-4489-a154-8b11c8c2ec87-kube-api-access-8chnq\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.661853 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a9c14cdd-ae95-4489-a154-8b11c8c2ec87\") " pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.704108 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-config-data\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.704184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-logs\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.704388 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-combined-ca-bundle\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.704428 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-scripts\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.704948 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-httpd-run\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.705035 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.705059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgtx9\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-kube-api-access-zgtx9\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.705110 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-ceph\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.705180 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-internal-tls-certs\") pod \"af640d37-f725-423e-b60d-8c160d6574e1\" (UID: \"af640d37-f725-423e-b60d-8c160d6574e1\") " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.705446 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-logs" (OuterVolumeSpecName: "logs") pod 
"af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.705959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.706991 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.707010 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af640d37-f725-423e-b60d-8c160d6574e1-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.709531 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-ceph" (OuterVolumeSpecName: "ceph") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.709763 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-kube-api-access-zgtx9" (OuterVolumeSpecName: "kube-api-access-zgtx9") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "kube-api-access-zgtx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.712207 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-scripts" (OuterVolumeSpecName: "scripts") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.712393 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.723863 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.740937 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.773377 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.777535 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-config-data" (OuterVolumeSpecName: "config-data") pod "af640d37-f725-423e-b60d-8c160d6574e1" (UID: "af640d37-f725-423e-b60d-8c160d6574e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.810786 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.810831 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgtx9\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-kube-api-access-zgtx9\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.810848 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/af640d37-f725-423e-b60d-8c160d6574e1-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.810860 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.810873 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.810885 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.810895 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af640d37-f725-423e-b60d-8c160d6574e1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.834536 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 11:47:25 crc kubenswrapper[4835]: I1002 11:47:25.912308 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.280211 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd8b146-5a2c-436e-add6-5adc721b7df9" path="/var/lib/kubelet/pods/3bd8b146-5a2c-436e-add6-5adc721b7df9/volumes" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.301037 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af640d37-f725-423e-b60d-8c160d6574e1","Type":"ContainerDied","Data":"536c133e683ed5e37f7598bf790cbfec96f261eaf38e1734cf92357075e36357"} Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.301094 4835 scope.go:117] "RemoveContainer" containerID="403e898e566760a2a301ca479a67e4b6dc7d6b17b91a62d2f4972970dcce2ee8" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.301043 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.350863 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.367027 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.382631 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:26 crc kubenswrapper[4835]: E1002 11:47:26.383181 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-log" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.383196 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-log" Oct 02 11:47:26 crc kubenswrapper[4835]: E1002 11:47:26.383240 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-httpd" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.383248 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-httpd" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.383526 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-httpd" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.383554 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af640d37-f725-423e-b60d-8c160d6574e1" containerName="glance-log" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.384829 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.388790 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.389008 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.397622 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.412364 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.414593 4835 scope.go:117] "RemoveContainer" containerID="c93f3b0bb55a1100d06a4fec054595d85f98624a1994266391a552fe9b9c65a3" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.535924 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20d39210-5076-4fca-9c17-dd4b6f18220c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.535991 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.536023 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.536057 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.536100 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20d39210-5076-4fca-9c17-dd4b6f18220c-logs\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.536133 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.536162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.536199 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/20d39210-5076-4fca-9c17-dd4b6f18220c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.536796 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pzvw\" (UniqueName: \"kubernetes.io/projected/20d39210-5076-4fca-9c17-dd4b6f18220c-kube-api-access-2pzvw\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.638189 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pzvw\" (UniqueName: \"kubernetes.io/projected/20d39210-5076-4fca-9c17-dd4b6f18220c-kube-api-access-2pzvw\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.638303 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20d39210-5076-4fca-9c17-dd4b6f18220c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.638359 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.638401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.638440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.638488 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20d39210-5076-4fca-9c17-dd4b6f18220c-logs\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.638522 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.639294 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.640329 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.640400 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/20d39210-5076-4fca-9c17-dd4b6f18220c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.640447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20d39210-5076-4fca-9c17-dd4b6f18220c-logs\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.640503 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20d39210-5076-4fca-9c17-dd4b6f18220c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.644970 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.649030 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.651423 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.656406 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/20d39210-5076-4fca-9c17-dd4b6f18220c-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.656991 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d39210-5076-4fca-9c17-dd4b6f18220c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.673037 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pzvw\" (UniqueName: \"kubernetes.io/projected/20d39210-5076-4fca-9c17-dd4b6f18220c-kube-api-access-2pzvw\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.680675 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"20d39210-5076-4fca-9c17-dd4b6f18220c\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:47:26 crc kubenswrapper[4835]: I1002 11:47:26.712334 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:27 crc kubenswrapper[4835]: I1002 11:47:27.335406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9c14cdd-ae95-4489-a154-8b11c8c2ec87","Type":"ContainerStarted","Data":"bfc4d213948b581998c42d702ff6dc84d034bf320a0b218ff15452657cd8aaf7"} Oct 02 11:47:27 crc kubenswrapper[4835]: I1002 11:47:27.342928 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:47:28 crc kubenswrapper[4835]: I1002 11:47:28.278841 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af640d37-f725-423e-b60d-8c160d6574e1" path="/var/lib/kubelet/pods/af640d37-f725-423e-b60d-8c160d6574e1/volumes" Oct 02 11:47:28 crc kubenswrapper[4835]: I1002 11:47:28.352766 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9c14cdd-ae95-4489-a154-8b11c8c2ec87","Type":"ContainerStarted","Data":"a0b44ea7efb949f1128b04c2cec0e627b88401c92c3b4022c3762e7f22a3c77e"} Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.107767 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.174153 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.315719 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-cef2-account-create-2tljw"] Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.317422 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-cef2-account-create-2tljw" Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.320398 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.330063 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-cef2-account-create-2tljw"] Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.435696 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtb4p\" (UniqueName: \"kubernetes.io/projected/b115bfa6-dbfa-4ef6-841b-7f48b78fadea-kube-api-access-vtb4p\") pod \"manila-cef2-account-create-2tljw\" (UID: \"b115bfa6-dbfa-4ef6-841b-7f48b78fadea\") " pod="openstack/manila-cef2-account-create-2tljw" Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.538613 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtb4p\" (UniqueName: \"kubernetes.io/projected/b115bfa6-dbfa-4ef6-841b-7f48b78fadea-kube-api-access-vtb4p\") pod \"manila-cef2-account-create-2tljw\" (UID: \"b115bfa6-dbfa-4ef6-841b-7f48b78fadea\") " pod="openstack/manila-cef2-account-create-2tljw" Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.559629 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtb4p\" (UniqueName: \"kubernetes.io/projected/b115bfa6-dbfa-4ef6-841b-7f48b78fadea-kube-api-access-vtb4p\") pod \"manila-cef2-account-create-2tljw\" (UID: \"b115bfa6-dbfa-4ef6-841b-7f48b78fadea\") " pod="openstack/manila-cef2-account-create-2tljw" Oct 02 11:47:30 crc kubenswrapper[4835]: I1002 11:47:30.640761 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-cef2-account-create-2tljw" Oct 02 11:47:33 crc kubenswrapper[4835]: I1002 11:47:33.396652 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"20d39210-5076-4fca-9c17-dd4b6f18220c","Type":"ContainerStarted","Data":"8148ae0448bbf9c66a63b5d26528766b9b8c4ad30ce127b26e3e8cab48dcc862"} Oct 02 11:47:33 crc kubenswrapper[4835]: I1002 11:47:33.454143 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-cef2-account-create-2tljw"] Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.417778 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fc895cc-7hsqv" event={"ID":"52db0e3d-e5a8-4f34-ba3d-283451e3c843","Type":"ContainerStarted","Data":"13190c78b5f9862caf73c2a58ab5bb4687d33b4baa65dec3d8b323c638c71a18"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.419239 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fc895cc-7hsqv" event={"ID":"52db0e3d-e5a8-4f34-ba3d-283451e3c843","Type":"ContainerStarted","Data":"dfd817162e933f514d8fe25e1380e8506b7d499cb72b838290aa7d5e16cf3094"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.419426 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c5fc895cc-7hsqv" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon-log" containerID="cri-o://13190c78b5f9862caf73c2a58ab5bb4687d33b4baa65dec3d8b323c638c71a18" gracePeriod=30 Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.419975 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c5fc895cc-7hsqv" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon" 
containerID="cri-o://dfd817162e933f514d8fe25e1380e8506b7d499cb72b838290aa7d5e16cf3094" gracePeriod=30 Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.428011 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74c74d79b4-rfzlk" event={"ID":"37beff21-75a0-4297-a84b-9a34ccb1d2e0","Type":"ContainerStarted","Data":"3fa84eb5024fd4bca93b4d7c1edf20279eb1b53420ac17b2a4faecad1a8d2b02"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.428051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74c74d79b4-rfzlk" event={"ID":"37beff21-75a0-4297-a84b-9a34ccb1d2e0","Type":"ContainerStarted","Data":"150a942f6a8baada3743cbe30652d6c9181e7f38db63213ad3b8f85e914cae23"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.431082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bf6dfccc-kgdvz" event={"ID":"c9fc8d63-9b88-4a08-bada-236a3c3d1bda","Type":"ContainerStarted","Data":"3903d7743e2f1f490ae67dc66c6dadf254bdf53aa7bccc8f066295da2b6148bc"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.431123 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bf6dfccc-kgdvz" event={"ID":"c9fc8d63-9b88-4a08-bada-236a3c3d1bda","Type":"ContainerStarted","Data":"27ebfdb88a2ef7e2cd4885e1a3cf50c156740b88ea6b2f06d2811f6ba7317700"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.431266 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9bf6dfccc-kgdvz" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon-log" containerID="cri-o://27ebfdb88a2ef7e2cd4885e1a3cf50c156740b88ea6b2f06d2811f6ba7317700" gracePeriod=30 Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.431580 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9bf6dfccc-kgdvz" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon" containerID="cri-o://3903d7743e2f1f490ae67dc66c6dadf254bdf53aa7bccc8f066295da2b6148bc" gracePeriod=30 Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.440105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"20d39210-5076-4fca-9c17-dd4b6f18220c","Type":"ContainerStarted","Data":"aaa1cf3c16e6cef63267927a5e0a7223348309c0c858da1832c6ecffd0adbf75"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.440352 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"20d39210-5076-4fca-9c17-dd4b6f18220c","Type":"ContainerStarted","Data":"e76d4de60db3b5c18fe269bc120fe34dc1ff6bf0f212027e325783b13c9a4156"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.446112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d655558cb-84687" event={"ID":"4fd0f229-d269-4fa9-bd48-0909ce1ce941","Type":"ContainerStarted","Data":"3d615e7a6c032955988012040fbc04e392345b1f6becf426f83f3e19346c63d9"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.446154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d655558cb-84687" event={"ID":"4fd0f229-d269-4fa9-bd48-0909ce1ce941","Type":"ContainerStarted","Data":"0a3811b6eb5665a68fd1dd279a5c7df3555bc496c0b12ed3e939dc6786402ae4"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.453946 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c5fc895cc-7hsqv" podStartSLOduration=2.778887762 podStartE2EDuration="14.453922259s" 
podCreationTimestamp="2025-10-02 11:47:20 +0000 UTC" firstStartedPulling="2025-10-02 11:47:21.422057173 +0000 UTC m=+3117.981964754" lastFinishedPulling="2025-10-02 11:47:33.09709167 +0000 UTC m=+3129.656999251" observedRunningTime="2025-10-02 11:47:34.445515247 +0000 UTC m=+3131.005422828" watchObservedRunningTime="2025-10-02 11:47:34.453922259 +0000 UTC m=+3131.013829850" Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.459568 4835 generic.go:334] "Generic (PLEG): container finished" podID="b115bfa6-dbfa-4ef6-841b-7f48b78fadea" containerID="a612e14df782188cf432c28f0ba150087c2625ef8d263c73b32ddf23c7fde897" exitCode=0 Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.459787 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-cef2-account-create-2tljw" event={"ID":"b115bfa6-dbfa-4ef6-841b-7f48b78fadea","Type":"ContainerDied","Data":"a612e14df782188cf432c28f0ba150087c2625ef8d263c73b32ddf23c7fde897"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.459930 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-cef2-account-create-2tljw" event={"ID":"b115bfa6-dbfa-4ef6-841b-7f48b78fadea","Type":"ContainerStarted","Data":"e648a3b1e240da953bb636f81e9acdda349ceeaf0f36c8302c58bb171c18fc3d"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.465129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9c14cdd-ae95-4489-a154-8b11c8c2ec87","Type":"ContainerStarted","Data":"47c639a5dee2f54b5efb117e225cfe03b085fdce4c9d30b9400817840685a749"} Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.472577 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-74c74d79b4-rfzlk" podStartSLOduration=3.09263888 podStartE2EDuration="12.472554595s" podCreationTimestamp="2025-10-02 11:47:22 +0000 UTC" firstStartedPulling="2025-10-02 11:47:23.736718807 +0000 UTC m=+3120.296626388" lastFinishedPulling="2025-10-02 11:47:33.116634522 +0000 UTC m=+3129.676542103" observedRunningTime="2025-10-02 11:47:34.466381117 +0000 UTC m=+3131.026288698" watchObservedRunningTime="2025-10-02 11:47:34.472554595 +0000 UTC m=+3131.032462176" Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.533623 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9bf6dfccc-kgdvz" podStartSLOduration=2.669760971 podStartE2EDuration="14.533593421s" podCreationTimestamp="2025-10-02 11:47:20 +0000 UTC" firstStartedPulling="2025-10-02 11:47:21.256519049 +0000 UTC m=+3117.816426630" lastFinishedPulling="2025-10-02 11:47:33.120351499 +0000 UTC m=+3129.680259080" observedRunningTime="2025-10-02 11:47:34.489592185 +0000 UTC m=+3131.049499766" watchObservedRunningTime="2025-10-02 11:47:34.533593421 +0000 UTC m=+3131.093501002" Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.562666 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.562645147 podStartE2EDuration="8.562645147s" podCreationTimestamp="2025-10-02 11:47:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:47:34.514863102 +0000 UTC m=+3131.074770703" watchObservedRunningTime="2025-10-02 11:47:34.562645147 +0000 UTC m=+3131.122552728" Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.576905 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=9.576888937 podStartE2EDuration="9.576888937s" podCreationTimestamp="2025-10-02 11:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:47:34.542926 +0000 UTC m=+3131.102833611" watchObservedRunningTime="2025-10-02 11:47:34.576888937 +0000 UTC m=+3131.136796518" Oct 02 11:47:34 crc kubenswrapper[4835]: I1002 11:47:34.597541 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d655558cb-84687" podStartSLOduration=3.523781548 podStartE2EDuration="12.597523791s" podCreationTimestamp="2025-10-02 11:47:22 +0000 UTC" firstStartedPulling="2025-10-02 11:47:24.038433871 +0000 UTC m=+3120.598341452" lastFinishedPulling="2025-10-02 11:47:33.112176114 +0000 UTC m=+3129.672083695" observedRunningTime="2025-10-02 11:47:34.587832762 +0000 UTC m=+3131.147740343" watchObservedRunningTime="2025-10-02 11:47:34.597523791 +0000 UTC m=+3131.157431372" Oct 02 11:47:35 crc kubenswrapper[4835]: I1002 11:47:35.729644 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:47:35 crc kubenswrapper[4835]: I1002 11:47:35.729955 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:47:35 crc kubenswrapper[4835]: I1002 11:47:35.763928 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:47:35 crc kubenswrapper[4835]: I1002 11:47:35.783167 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:47:35 crc kubenswrapper[4835]: I1002 11:47:35.884040 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-cef2-account-create-2tljw" Oct 02 11:47:35 crc kubenswrapper[4835]: I1002 11:47:35.967741 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtb4p\" (UniqueName: \"kubernetes.io/projected/b115bfa6-dbfa-4ef6-841b-7f48b78fadea-kube-api-access-vtb4p\") pod \"b115bfa6-dbfa-4ef6-841b-7f48b78fadea\" (UID: \"b115bfa6-dbfa-4ef6-841b-7f48b78fadea\") " Oct 02 11:47:35 crc kubenswrapper[4835]: I1002 11:47:35.974149 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b115bfa6-dbfa-4ef6-841b-7f48b78fadea-kube-api-access-vtb4p" (OuterVolumeSpecName: "kube-api-access-vtb4p") pod "b115bfa6-dbfa-4ef6-841b-7f48b78fadea" (UID: "b115bfa6-dbfa-4ef6-841b-7f48b78fadea"). InnerVolumeSpecName "kube-api-access-vtb4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.070114 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtb4p\" (UniqueName: \"kubernetes.io/projected/b115bfa6-dbfa-4ef6-841b-7f48b78fadea-kube-api-access-vtb4p\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.513528 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-cef2-account-create-2tljw" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.514159 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-cef2-account-create-2tljw" event={"ID":"b115bfa6-dbfa-4ef6-841b-7f48b78fadea","Type":"ContainerDied","Data":"e648a3b1e240da953bb636f81e9acdda349ceeaf0f36c8302c58bb171c18fc3d"} Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.514207 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e648a3b1e240da953bb636f81e9acdda349ceeaf0f36c8302c58bb171c18fc3d" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.514244 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.514271 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.713012 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.713628 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.753668 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:36 crc kubenswrapper[4835]: I1002 11:47:36.762247 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:37 crc kubenswrapper[4835]: I1002 11:47:37.521425 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:37 crc kubenswrapper[4835]: I1002 11:47:37.521476 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:38 crc kubenswrapper[4835]: I1002 11:47:38.529017 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.040279 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.604915 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.610569 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-fqh6r"] Oct 02 11:47:40 crc kubenswrapper[4835]: E1002 11:47:40.611083 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b115bfa6-dbfa-4ef6-841b-7f48b78fadea" containerName="mariadb-account-create" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.611107 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b115bfa6-dbfa-4ef6-841b-7f48b78fadea" containerName="mariadb-account-create" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.611396 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b115bfa6-dbfa-4ef6-841b-7f48b78fadea" containerName="mariadb-account-create" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.612279 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.617339 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.617690 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-sxmnh" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.618783 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fqh6r"] Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.640926 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.681972 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.696584 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-combined-ca-bundle\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.696684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-job-config-data\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.696725 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgq4v\" (UniqueName: \"kubernetes.io/projected/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-kube-api-access-lgq4v\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.696917 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-config-data\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.713279 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.799411 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-combined-ca-bundle\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.799471 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-job-config-data\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.799493 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lgq4v\" (UniqueName: \"kubernetes.io/projected/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-kube-api-access-lgq4v\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.800469 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-config-data\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.814465 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-combined-ca-bundle\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.817933 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-job-config-data\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.819965 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgq4v\" (UniqueName: \"kubernetes.io/projected/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-kube-api-access-lgq4v\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.822436 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-config-data\") pod \"manila-db-sync-fqh6r\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:40 crc kubenswrapper[4835]: I1002 11:47:40.932875 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:41 crc kubenswrapper[4835]: I1002 11:47:41.651252 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fqh6r"] Oct 02 11:47:41 crc kubenswrapper[4835]: W1002 11:47:41.657313 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65d5cda4_bd3f_4c36_93cc_1209c49e43ee.slice/crio-dd8e28ea5b21b5e88a22d31f9b383a136c59bc394e305109426327a09659481e WatchSource:0}: Error finding container dd8e28ea5b21b5e88a22d31f9b383a136c59bc394e305109426327a09659481e: Status 404 returned error can't find the container with id dd8e28ea5b21b5e88a22d31f9b383a136c59bc394e305109426327a09659481e Oct 02 11:47:41 crc kubenswrapper[4835]: I1002 11:47:41.866007 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:47:41 crc kubenswrapper[4835]: I1002 11:47:41.983823 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:47:41 crc kubenswrapper[4835]: I1002 11:47:41.983888 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:47:41 crc kubenswrapper[4835]: I1002 11:47:41.983945 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:47:41 crc kubenswrapper[4835]: I1002 11:47:41.984859 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:47:41 crc kubenswrapper[4835]: I1002 11:47:41.984917 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" gracePeriod=600 Oct 02 11:47:42 crc kubenswrapper[4835]: E1002 11:47:42.131313 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:47:42 crc kubenswrapper[4835]: I1002 11:47:42.569905 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" exitCode=0 Oct 02 11:47:42 crc kubenswrapper[4835]: I1002 11:47:42.569982 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe"} Oct 02 11:47:42 crc kubenswrapper[4835]: I1002 11:47:42.570038 4835 scope.go:117] "RemoveContainer" containerID="0244201f52c8dd5e3922b1708549b362fa6db05645f36e77deac378419afe8ad" Oct 02 11:47:42 crc kubenswrapper[4835]: I1002 11:47:42.570865 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:47:42 crc kubenswrapper[4835]: E1002 11:47:42.571290 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:47:42 crc kubenswrapper[4835]: I1002 11:47:42.574182 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh6r" event={"ID":"65d5cda4-bd3f-4c36-93cc-1209c49e43ee","Type":"ContainerStarted","Data":"dd8e28ea5b21b5e88a22d31f9b383a136c59bc394e305109426327a09659481e"} Oct 02 11:47:43 crc kubenswrapper[4835]: I1002 11:47:43.238759 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:43 crc kubenswrapper[4835]: I1002 11:47:43.239053 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:47:43 crc kubenswrapper[4835]: I1002 11:47:43.240535 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-74c74d79b4-rfzlk" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 02 11:47:43 crc kubenswrapper[4835]: I1002 11:47:43.414879 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:43 crc kubenswrapper[4835]: I1002 11:47:43.414940 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d655558cb-84687" Oct 02 11:47:43 crc kubenswrapper[4835]: I1002 11:47:43.417139 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d655558cb-84687" podUID="4fd0f229-d269-4fa9-bd48-0909ce1ce941" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.244:8443: connect: connection refused" Oct 02 11:47:46 crc kubenswrapper[4835]: I1002 11:47:46.631456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh6r" event={"ID":"65d5cda4-bd3f-4c36-93cc-1209c49e43ee","Type":"ContainerStarted","Data":"7e882b6662d6f77fe7d881d7c81e9c6ad4985594c1a6dbeb1ede8a2dc82299db"} Oct 02 11:47:46 crc kubenswrapper[4835]: I1002 11:47:46.654862 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-fqh6r" podStartSLOduration=2.23844043 podStartE2EDuration="6.654842299s" podCreationTimestamp="2025-10-02 11:47:40 +0000 UTC" firstStartedPulling="2025-10-02 
11:47:41.65990869 +0000 UTC m=+3138.219816271" lastFinishedPulling="2025-10-02 11:47:46.076310559 +0000 UTC m=+3142.636218140" observedRunningTime="2025-10-02 11:47:46.648238589 +0000 UTC m=+3143.208146180" watchObservedRunningTime="2025-10-02 11:47:46.654842299 +0000 UTC m=+3143.214749890" Oct 02 11:47:53 crc kubenswrapper[4835]: I1002 11:47:53.238857 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-74c74d79b4-rfzlk" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 02 11:47:53 crc kubenswrapper[4835]: I1002 11:47:53.252365 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:47:53 crc kubenswrapper[4835]: E1002 11:47:53.252678 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:47:53 crc kubenswrapper[4835]: I1002 11:47:53.415450 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d655558cb-84687" podUID="4fd0f229-d269-4fa9-bd48-0909ce1ce941" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.244:8443: connect: connection refused" Oct 02 11:47:57 crc kubenswrapper[4835]: I1002 11:47:57.734308 4835 generic.go:334] "Generic (PLEG): container finished" podID="65d5cda4-bd3f-4c36-93cc-1209c49e43ee" containerID="7e882b6662d6f77fe7d881d7c81e9c6ad4985594c1a6dbeb1ede8a2dc82299db" exitCode=0 Oct 02 11:47:57 crc kubenswrapper[4835]: I1002 11:47:57.734417 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh6r" event={"ID":"65d5cda4-bd3f-4c36-93cc-1209c49e43ee","Type":"ContainerDied","Data":"7e882b6662d6f77fe7d881d7c81e9c6ad4985594c1a6dbeb1ede8a2dc82299db"} Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.376064 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fqh6r" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.460969 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-combined-ca-bundle\") pod \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.461053 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-config-data\") pod \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.461116 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgq4v\" (UniqueName: \"kubernetes.io/projected/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-kube-api-access-lgq4v\") pod \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.461271 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-job-config-data\") pod \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\" (UID: \"65d5cda4-bd3f-4c36-93cc-1209c49e43ee\") " Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.469009 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-kube-api-access-lgq4v" (OuterVolumeSpecName: "kube-api-access-lgq4v") pod "65d5cda4-bd3f-4c36-93cc-1209c49e43ee" (UID: "65d5cda4-bd3f-4c36-93cc-1209c49e43ee"). InnerVolumeSpecName "kube-api-access-lgq4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.470979 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "65d5cda4-bd3f-4c36-93cc-1209c49e43ee" (UID: "65d5cda4-bd3f-4c36-93cc-1209c49e43ee"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.473766 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-config-data" (OuterVolumeSpecName: "config-data") pod "65d5cda4-bd3f-4c36-93cc-1209c49e43ee" (UID: "65d5cda4-bd3f-4c36-93cc-1209c49e43ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.512671 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65d5cda4-bd3f-4c36-93cc-1209c49e43ee" (UID: "65d5cda4-bd3f-4c36-93cc-1209c49e43ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.563409 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.563651 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.563662 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgq4v\" (UniqueName: \"kubernetes.io/projected/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-kube-api-access-lgq4v\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.563673 4835 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/65d5cda4-bd3f-4c36-93cc-1209c49e43ee-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.752589 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh6r" event={"ID":"65d5cda4-bd3f-4c36-93cc-1209c49e43ee","Type":"ContainerDied","Data":"dd8e28ea5b21b5e88a22d31f9b383a136c59bc394e305109426327a09659481e"} Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.752848 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd8e28ea5b21b5e88a22d31f9b383a136c59bc394e305109426327a09659481e" Oct 02 11:47:59 crc kubenswrapper[4835]: I1002 11:47:59.752644 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-fqh6r" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.021726 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:00 crc kubenswrapper[4835]: E1002 11:48:00.022158 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d5cda4-bd3f-4c36-93cc-1209c49e43ee" containerName="manila-db-sync" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.022182 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d5cda4-bd3f-4c36-93cc-1209c49e43ee" containerName="manila-db-sync" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.022459 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d5cda4-bd3f-4c36-93cc-1209c49e43ee" containerName="manila-db-sync" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.023583 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.026868 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.027276 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.027419 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.027980 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-sxmnh" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.036151 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.038351 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.041332 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.051661 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.066784 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.173755 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.173805 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjmd2\" (UniqueName: \"kubernetes.io/projected/04247211-0a2a-4cf7-989f-87a89e84bca6-kube-api-access-wjmd2\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.173833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.173863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.173888 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-ceph\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.173907 
4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-scripts\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.173956 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.174015 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.174082 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.174181 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g77g\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-kube-api-access-8g77g\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.174207 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.174280 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-scripts\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.174337 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04247211-0a2a-4cf7-989f-87a89e84bca6-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.174366 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.180447 4835 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-lqghn"] Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.183038 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.196908 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-lqghn"] Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.276764 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-scripts\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277094 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04247211-0a2a-4cf7-989f-87a89e84bca6-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277189 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277300 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277386 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjmd2\" (UniqueName: \"kubernetes.io/projected/04247211-0a2a-4cf7-989f-87a89e84bca6-kube-api-access-wjmd2\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277458 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277735 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-ceph\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277813 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-scripts\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.277888 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.278002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.278109 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.278238 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g77g\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-kube-api-access-8g77g\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.278324 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.282797 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.285402 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.288724 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-scripts\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.289092 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04247211-0a2a-4cf7-989f-87a89e84bca6-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc 
kubenswrapper[4835]: I1002 11:48:00.290276 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.293415 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.293776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-ceph\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.294604 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.308387 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.308937 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.309723 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.314115 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-scripts\") pod \"manila-scheduler-0\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.317254 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g77g\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-kube-api-access-8g77g\") pod \"manila-share-share1-0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.328357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjmd2\" (UniqueName: \"kubernetes.io/projected/04247211-0a2a-4cf7-989f-87a89e84bca6-kube-api-access-wjmd2\") pod \"manila-scheduler-0\" (UID: 
\"04247211-0a2a-4cf7-989f-87a89e84bca6\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.351153 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.379822 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.382355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.382538 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-config\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.382722 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.382875 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxhxh\" (UniqueName: \"kubernetes.io/projected/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-kube-api-access-pxhxh\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.383019 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.383116 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.384144 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.386384 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.393872 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.409902 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-etc-machine-id\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489346 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data-custom\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489367 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k46ld\" (UniqueName: \"kubernetes.io/projected/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-kube-api-access-k46ld\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489413 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxhxh\" (UniqueName: \"kubernetes.io/projected/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-kube-api-access-pxhxh\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489456 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489482 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489503 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489532 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-logs\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489555 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-scripts\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489596 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-config\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.489677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.490542 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.491587 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.492107 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.492652 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.493157 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-config\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: 
\"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.526910 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxhxh\" (UniqueName: \"kubernetes.io/projected/913bcc4c-ed8f-4b09-b645-09cbb3e7943a-kube-api-access-pxhxh\") pod \"dnsmasq-dns-76b5fdb995-lqghn\" (UID: \"913bcc4c-ed8f-4b09-b645-09cbb3e7943a\") " pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.592579 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.592656 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-logs\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.592686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-scripts\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.592745 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.592793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-etc-machine-id\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.592819 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data-custom\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.592837 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k46ld\" (UniqueName: \"kubernetes.io/projected/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-kube-api-access-k46ld\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.593259 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-logs\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.593378 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-etc-machine-id\") pod 
\"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.600848 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.601664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-scripts\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.601736 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data-custom\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.609395 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.625700 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k46ld\" (UniqueName: \"kubernetes.io/projected/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-kube-api-access-k46ld\") pod \"manila-api-0\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " pod="openstack/manila-api-0" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.818134 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:00 crc kubenswrapper[4835]: I1002 11:48:00.915866 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.103775 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.256377 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:01 crc kubenswrapper[4835]: W1002 11:48:01.258349 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1730f4f8_033a_43b9_8081_22825c549cc0.slice/crio-39da1b6a2063d520a7d0b0edc3736afc8d911cc88b123ac1b42ecb435d03f8ec WatchSource:0}: Error finding container 39da1b6a2063d520a7d0b0edc3736afc8d911cc88b123ac1b42ecb435d03f8ec: Status 404 returned error can't find the container with id 39da1b6a2063d520a7d0b0edc3736afc8d911cc88b123ac1b42ecb435d03f8ec Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.321232 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-lqghn"] Oct 02 11:48:01 crc kubenswrapper[4835]: W1002 11:48:01.340022 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod913bcc4c_ed8f_4b09_b645_09cbb3e7943a.slice/crio-2bf1b4509573cab5d92e8d9c14054375a800d5891d2094b23c70909143bdb994 WatchSource:0}: Error finding container 2bf1b4509573cab5d92e8d9c14054375a800d5891d2094b23c70909143bdb994: Status 404 returned error can't find the container with id 2bf1b4509573cab5d92e8d9c14054375a800d5891d2094b23c70909143bdb994 Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.776517 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1730f4f8-033a-43b9-8081-22825c549cc0","Type":"ContainerStarted","Data":"39da1b6a2063d520a7d0b0edc3736afc8d911cc88b123ac1b42ecb435d03f8ec"} Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.778957 4835 generic.go:334] "Generic (PLEG): container finished" podID="913bcc4c-ed8f-4b09-b645-09cbb3e7943a" containerID="5500fca812c53e0f6eb4e989dfe6e79976a99aefeb258d091080f204d786d7a7" exitCode=0 Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.779031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" event={"ID":"913bcc4c-ed8f-4b09-b645-09cbb3e7943a","Type":"ContainerDied","Data":"5500fca812c53e0f6eb4e989dfe6e79976a99aefeb258d091080f204d786d7a7"} Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.779059 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" event={"ID":"913bcc4c-ed8f-4b09-b645-09cbb3e7943a","Type":"ContainerStarted","Data":"2bf1b4509573cab5d92e8d9c14054375a800d5891d2094b23c70909143bdb994"} Oct 02 11:48:01 crc kubenswrapper[4835]: I1002 11:48:01.780495 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04247211-0a2a-4cf7-989f-87a89e84bca6","Type":"ContainerStarted","Data":"33c3d098bf92aebdde60146fee051412e4991a2b6302bf546c7981101eb337b9"} Oct 02 11:48:02 crc kubenswrapper[4835]: I1002 11:48:02.306633 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:02 crc kubenswrapper[4835]: W1002 11:48:02.314051 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0696c5_0d3c_4a8d_a4d1_2fa4911072d9.slice/crio-36f8664a5bd1e30ae69acb60a57c75a3f1b6136320b4b13b398ce83d32e7f2d8 WatchSource:0}: Error finding container 36f8664a5bd1e30ae69acb60a57c75a3f1b6136320b4b13b398ce83d32e7f2d8: Status 404 returned error can't find the container with id 36f8664a5bd1e30ae69acb60a57c75a3f1b6136320b4b13b398ce83d32e7f2d8 Oct 02 11:48:02 crc kubenswrapper[4835]: I1002 11:48:02.817070 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" event={"ID":"913bcc4c-ed8f-4b09-b645-09cbb3e7943a","Type":"ContainerStarted","Data":"a349685d9376f772c48cef15891d1a6d1131743e093396003c2cd33ce8554d59"} Oct 02 11:48:02 crc kubenswrapper[4835]: I1002 11:48:02.818599 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:02 crc kubenswrapper[4835]: I1002 11:48:02.840785 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04247211-0a2a-4cf7-989f-87a89e84bca6","Type":"ContainerStarted","Data":"3e6b25fdc23ea30691cbc0bb2c49987854cbeb6824979fa95656432e01d32b06"} Oct 02 11:48:02 crc kubenswrapper[4835]: I1002 11:48:02.850518 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9","Type":"ContainerStarted","Data":"36f8664a5bd1e30ae69acb60a57c75a3f1b6136320b4b13b398ce83d32e7f2d8"} Oct 02 11:48:02 crc kubenswrapper[4835]: I1002 11:48:02.858780 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" podStartSLOduration=2.858762943 podStartE2EDuration="2.858762943s" podCreationTimestamp="2025-10-02 11:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:48:02.842106804 +0000 UTC m=+3159.402014395" watchObservedRunningTime="2025-10-02 11:48:02.858762943 +0000 UTC m=+3159.418670514" Oct 02 11:48:02 crc kubenswrapper[4835]: I1002 11:48:02.880515 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.862689 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04247211-0a2a-4cf7-989f-87a89e84bca6","Type":"ContainerStarted","Data":"ab6fff716d7a0f23dcabc2acd1c93bf49365fcc2e306559382d8c5f929a29c18"} Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.865081 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9","Type":"ContainerStarted","Data":"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839"} Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.865137 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9","Type":"ContainerStarted","Data":"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e"} Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.865404 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api-log" containerID="cri-o://8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e" gracePeriod=30 Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.865463 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.865514 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api" containerID="cri-o://5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839" gracePeriod=30 Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.925362 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.227393582 podStartE2EDuration="4.925339058s" podCreationTimestamp="2025-10-02 11:47:59 +0000 UTC" firstStartedPulling="2025-10-02 11:48:01.110604883 +0000 UTC m=+3157.670512464" lastFinishedPulling="2025-10-02 11:48:01.808550359 +0000 UTC m=+3158.368457940" observedRunningTime="2025-10-02 11:48:03.896974141 +0000 UTC m=+3160.456881722" watchObservedRunningTime="2025-10-02 11:48:03.925339058 +0000 UTC m=+3160.485246659" Oct 02 11:48:03 crc kubenswrapper[4835]: I1002 11:48:03.934375 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.934349497 podStartE2EDuration="3.934349497s" podCreationTimestamp="2025-10-02 11:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:48:03.920925151 +0000 UTC m=+3160.480832732" watchObservedRunningTime="2025-10-02 11:48:03.934349497 +0000 UTC m=+3160.494257078" Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.860984 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.894542 4835 generic.go:334] "Generic (PLEG): container finished" podID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerID="dfd817162e933f514d8fe25e1380e8506b7d499cb72b838290aa7d5e16cf3094" exitCode=137 Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.894596 4835 generic.go:334] "Generic (PLEG): container finished" podID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerID="13190c78b5f9862caf73c2a58ab5bb4687d33b4baa65dec3d8b323c638c71a18" exitCode=137 Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.894658 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fc895cc-7hsqv" event={"ID":"52db0e3d-e5a8-4f34-ba3d-283451e3c843","Type":"ContainerDied","Data":"dfd817162e933f514d8fe25e1380e8506b7d499cb72b838290aa7d5e16cf3094"} Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.894685 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fc895cc-7hsqv" event={"ID":"52db0e3d-e5a8-4f34-ba3d-283451e3c843","Type":"ContainerDied","Data":"13190c78b5f9862caf73c2a58ab5bb4687d33b4baa65dec3d8b323c638c71a18"} Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.901088 4835 generic.go:334] "Generic (PLEG): container finished" podID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerID="3903d7743e2f1f490ae67dc66c6dadf254bdf53aa7bccc8f066295da2b6148bc" exitCode=137 Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.901132 4835 generic.go:334] "Generic (PLEG): container finished" podID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerID="27ebfdb88a2ef7e2cd4885e1a3cf50c156740b88ea6b2f06d2811f6ba7317700" exitCode=137 Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.901170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-9bf6dfccc-kgdvz" event={"ID":"c9fc8d63-9b88-4a08-bada-236a3c3d1bda","Type":"ContainerDied","Data":"3903d7743e2f1f490ae67dc66c6dadf254bdf53aa7bccc8f066295da2b6148bc"} Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.901243 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bf6dfccc-kgdvz" event={"ID":"c9fc8d63-9b88-4a08-bada-236a3c3d1bda","Type":"ContainerDied","Data":"27ebfdb88a2ef7e2cd4885e1a3cf50c156740b88ea6b2f06d2811f6ba7317700"} Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.902884 4835 generic.go:334] "Generic (PLEG): container finished" podID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerID="5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839" exitCode=0 Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.902914 4835 generic.go:334] "Generic (PLEG): container finished" podID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerID="8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e" exitCode=143 Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.903350 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9","Type":"ContainerDied","Data":"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839"} Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.903381 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.903396 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9","Type":"ContainerDied","Data":"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e"} Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.903409 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9","Type":"ContainerDied","Data":"36f8664a5bd1e30ae69acb60a57c75a3f1b6136320b4b13b398ce83d32e7f2d8"} Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.903428 4835 scope.go:117] "RemoveContainer" containerID="5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839" Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.987138 4835 scope.go:117] "RemoveContainer" containerID="8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e" Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.994294 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-logs\") pod \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.994343 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-combined-ca-bundle\") pod \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.994496 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data-custom\") pod \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.994545 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k46ld\" (UniqueName: \"kubernetes.io/projected/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-kube-api-access-k46ld\") pod \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.994619 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-scripts\") pod \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.994733 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-etc-machine-id\") pod \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.994766 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data\") pod \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\" (UID: \"5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9\") " Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.995363 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" (UID: "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.995471 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:04 crc kubenswrapper[4835]: I1002 11:48:04.995791 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-logs" (OuterVolumeSpecName: "logs") pod "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" (UID: "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.005417 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-kube-api-access-k46ld" (OuterVolumeSpecName: "kube-api-access-k46ld") pod "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" (UID: "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9"). InnerVolumeSpecName "kube-api-access-k46ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.017444 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" (UID: "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.022682 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-scripts" (OuterVolumeSpecName: "scripts") pod "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" (UID: "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.058549 4835 scope.go:117] "RemoveContainer" containerID="5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.062817 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" (UID: "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.072127 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839\": container with ID starting with 5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839 not found: ID does not exist" containerID="5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.072170 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839"} err="failed to get container status \"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839\": rpc error: code = NotFound desc = could not find container \"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839\": container with ID starting with 5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839 not found: ID does not exist" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.072194 4835 scope.go:117] "RemoveContainer" containerID="8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e" Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.073067 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e\": container with ID starting with 8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e not found: ID does not exist" containerID="8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.073105 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e"} err="failed to get container status \"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e\": rpc error: code = NotFound desc = could not find container \"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e\": container with ID starting with 8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e not found: ID does not exist" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.073689 4835 scope.go:117] "RemoveContainer" 
containerID="5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.074029 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839"} err="failed to get container status \"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839\": rpc error: code = NotFound desc = could not find container \"5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839\": container with ID starting with 5ab7b99a08f0d08461823f859ae56f400eb2c86762aabb72de3a5b86f8b05839 not found: ID does not exist" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.074056 4835 scope.go:117] "RemoveContainer" containerID="8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.074379 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e"} err="failed to get container status \"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e\": rpc error: code = NotFound desc = could not find container \"8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e\": container with ID starting with 8aae97d02338e62783b7e942aa1fcc70f16f2b400323d8f4bb82e958d59afe6e not found: ID does not exist" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.098469 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.098508 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.098521 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.098533 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k46ld\" (UniqueName: \"kubernetes.io/projected/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-kube-api-access-k46ld\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.098546 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.154553 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.165365 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data" (OuterVolumeSpecName: "config-data") pod "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" (UID: "5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.200844 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.206116 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.254841 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.255360 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.257955 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.276680 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.283956 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.284464 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284482 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon" Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.284497 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284506 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api" Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.284522 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon-log" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284530 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon-log" Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.284545 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon-log" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284554 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon-log" Oct 02 11:48:05 crc kubenswrapper[4835]: E1002 11:48:05.284567 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284572 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon" Oct 02 11:48:05 
crc kubenswrapper[4835]: E1002 11:48:05.284585 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api-log" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284592 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api-log" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284783 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon-log" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284803 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" containerName="horizon" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284817 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284832 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api-log" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284847 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" containerName="horizon-log" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.284862 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" containerName="manila-api" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.286082 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.290902 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.291285 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.291440 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302377 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-logs\") pod \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302434 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-scripts\") pod \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302481 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-horizon-secret-key\") pod \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302538 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-config-data\") pod 
\"52db0e3d-e5a8-4f34-ba3d-283451e3c843\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302596 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l55qv\" (UniqueName: \"kubernetes.io/projected/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-kube-api-access-l55qv\") pod \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302672 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnc5n\" (UniqueName: \"kubernetes.io/projected/52db0e3d-e5a8-4f34-ba3d-283451e3c843-kube-api-access-wnc5n\") pod \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302709 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52db0e3d-e5a8-4f34-ba3d-283451e3c843-logs\") pod \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302727 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-config-data\") pod \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302749 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-scripts\") pod \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\" (UID: \"c9fc8d63-9b88-4a08-bada-236a3c3d1bda\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.302782 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52db0e3d-e5a8-4f34-ba3d-283451e3c843-horizon-secret-key\") pod \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\" (UID: \"52db0e3d-e5a8-4f34-ba3d-283451e3c843\") " Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.305331 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52db0e3d-e5a8-4f34-ba3d-283451e3c843-logs" (OuterVolumeSpecName: "logs") pod "52db0e3d-e5a8-4f34-ba3d-283451e3c843" (UID: "52db0e3d-e5a8-4f34-ba3d-283451e3c843"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.306216 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-logs" (OuterVolumeSpecName: "logs") pod "c9fc8d63-9b88-4a08-bada-236a3c3d1bda" (UID: "c9fc8d63-9b88-4a08-bada-236a3c3d1bda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.315082 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.316792 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c9fc8d63-9b88-4a08-bada-236a3c3d1bda" (UID: "c9fc8d63-9b88-4a08-bada-236a3c3d1bda"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.321007 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52db0e3d-e5a8-4f34-ba3d-283451e3c843-kube-api-access-wnc5n" (OuterVolumeSpecName: "kube-api-access-wnc5n") pod "52db0e3d-e5a8-4f34-ba3d-283451e3c843" (UID: "52db0e3d-e5a8-4f34-ba3d-283451e3c843"). InnerVolumeSpecName "kube-api-access-wnc5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.321913 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52db0e3d-e5a8-4f34-ba3d-283451e3c843-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "52db0e3d-e5a8-4f34-ba3d-283451e3c843" (UID: "52db0e3d-e5a8-4f34-ba3d-283451e3c843"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.322309 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-kube-api-access-l55qv" (OuterVolumeSpecName: "kube-api-access-l55qv") pod "c9fc8d63-9b88-4a08-bada-236a3c3d1bda" (UID: "c9fc8d63-9b88-4a08-bada-236a3c3d1bda"). InnerVolumeSpecName "kube-api-access-l55qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.337864 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-config-data" (OuterVolumeSpecName: "config-data") pod "52db0e3d-e5a8-4f34-ba3d-283451e3c843" (UID: "52db0e3d-e5a8-4f34-ba3d-283451e3c843"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.343286 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-config-data" (OuterVolumeSpecName: "config-data") pod "c9fc8d63-9b88-4a08-bada-236a3c3d1bda" (UID: "c9fc8d63-9b88-4a08-bada-236a3c3d1bda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.359966 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-scripts" (OuterVolumeSpecName: "scripts") pod "c9fc8d63-9b88-4a08-bada-236a3c3d1bda" (UID: "c9fc8d63-9b88-4a08-bada-236a3c3d1bda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.361610 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-scripts" (OuterVolumeSpecName: "scripts") pod "52db0e3d-e5a8-4f34-ba3d-283451e3c843" (UID: "52db0e3d-e5a8-4f34-ba3d-283451e3c843"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.405111 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2t55\" (UniqueName: \"kubernetes.io/projected/80b69826-9737-464b-a39c-1b853ed917db-kube-api-access-n2t55\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.405349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-scripts\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.405413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b69826-9737-464b-a39c-1b853ed917db-etc-machine-id\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.406172 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-internal-tls-certs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.406315 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-public-tls-certs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.406739 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-config-data\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.406786 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80b69826-9737-464b-a39c-1b853ed917db-logs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.406960 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.407053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-config-data-custom\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.407448 4835 reconciler_common.go:293] "Volume detached for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52db0e3d-e5a8-4f34-ba3d-283451e3c843-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.407781 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.410777 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.410790 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.410801 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52db0e3d-e5a8-4f34-ba3d-283451e3c843-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.411249 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l55qv\" (UniqueName: \"kubernetes.io/projected/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-kube-api-access-l55qv\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.411265 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnc5n\" (UniqueName: \"kubernetes.io/projected/52db0e3d-e5a8-4f34-ba3d-283451e3c843-kube-api-access-wnc5n\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.411274 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52db0e3d-e5a8-4f34-ba3d-283451e3c843-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.411286 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.411295 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9fc8d63-9b88-4a08-bada-236a3c3d1bda-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80b69826-9737-464b-a39c-1b853ed917db-logs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513500 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513734 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-config-data-custom\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " 
pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513780 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2t55\" (UniqueName: \"kubernetes.io/projected/80b69826-9737-464b-a39c-1b853ed917db-kube-api-access-n2t55\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513812 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-scripts\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513839 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b69826-9737-464b-a39c-1b853ed917db-etc-machine-id\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513919 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-internal-tls-certs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513957 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-public-tls-certs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513993 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-config-data\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.513995 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b69826-9737-464b-a39c-1b853ed917db-etc-machine-id\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.514105 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80b69826-9737-464b-a39c-1b853ed917db-logs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.521738 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-scripts\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.522291 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-internal-tls-certs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc 
kubenswrapper[4835]: I1002 11:48:05.522358 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.523010 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-config-data\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.524892 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-config-data-custom\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.535859 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b69826-9737-464b-a39c-1b853ed917db-public-tls-certs\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.536047 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2t55\" (UniqueName: \"kubernetes.io/projected/80b69826-9737-464b-a39c-1b853ed917db-kube-api-access-n2t55\") pod \"manila-api-0\" (UID: \"80b69826-9737-464b-a39c-1b853ed917db\") " pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.617390 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.930316 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fc895cc-7hsqv" event={"ID":"52db0e3d-e5a8-4f34-ba3d-283451e3c843","Type":"ContainerDied","Data":"fd5cd0c743c491d416cd02aa194d947d3a61ddf909d209ec6ab4804b4cfca08b"} Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.930837 4835 scope.go:117] "RemoveContainer" containerID="dfd817162e933f514d8fe25e1380e8506b7d499cb72b838290aa7d5e16cf3094" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.931064 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c5fc895cc-7hsqv" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.937650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bf6dfccc-kgdvz" event={"ID":"c9fc8d63-9b88-4a08-bada-236a3c3d1bda","Type":"ContainerDied","Data":"cdf1d981236eea9b2fb3abc531e8dd682c4e549a8e8541a3806c860103437a08"} Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.937732 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9bf6dfccc-kgdvz" Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.983293 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9bf6dfccc-kgdvz"] Oct 02 11:48:05 crc kubenswrapper[4835]: I1002 11:48:05.998726 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9bf6dfccc-kgdvz"] Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.010341 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c5fc895cc-7hsqv"] Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.021815 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c5fc895cc-7hsqv"] Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.049658 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.123314 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d655558cb-84687" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.240528 4835 scope.go:117] "RemoveContainer" containerID="13190c78b5f9862caf73c2a58ab5bb4687d33b4baa65dec3d8b323c638c71a18" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.272510 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52db0e3d-e5a8-4f34-ba3d-283451e3c843" path="/var/lib/kubelet/pods/52db0e3d-e5a8-4f34-ba3d-283451e3c843/volumes" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.273716 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9" path="/var/lib/kubelet/pods/5e0696c5-0d3c-4a8d-a4d1-2fa4911072d9/volumes" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.274907 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9fc8d63-9b88-4a08-bada-236a3c3d1bda" path="/var/lib/kubelet/pods/c9fc8d63-9b88-4a08-bada-236a3c3d1bda/volumes" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.276426 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.289679 4835 scope.go:117] "RemoveContainer" containerID="3903d7743e2f1f490ae67dc66c6dadf254bdf53aa7bccc8f066295da2b6148bc" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.499325 4835 scope.go:117] "RemoveContainer" containerID="27ebfdb88a2ef7e2cd4885e1a3cf50c156740b88ea6b2f06d2811f6ba7317700" Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.526189 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.526456 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-central-agent" containerID="cri-o://0c5061c706e1924d68bd435577b3a879924ba6a43d453f8054d879c4d2769b1c" gracePeriod=30 Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.526841 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="proxy-httpd" containerID="cri-o://75822ef3154de0aba12eb764accc619ff70ad5870583ee58753a92f4e7648898" gracePeriod=30 Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.526890 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" 
containerName="sg-core" containerID="cri-o://1fabad7a7dc9d9d6c217458ff6c6d9d451e0d3bf0b14f2b952653e46833449bd" gracePeriod=30 Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.526922 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-notification-agent" containerID="cri-o://c5f41bf19843de65c7058a557f1bb07691dfa9cbab75c48bfb2e354fa6c428d4" gracePeriod=30 Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.971773 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"80b69826-9737-464b-a39c-1b853ed917db","Type":"ContainerStarted","Data":"6884a0af6479d697b5544b79fa2a913f869eab944c55dc95b9c6074abb79e995"} Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.973145 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"80b69826-9737-464b-a39c-1b853ed917db","Type":"ContainerStarted","Data":"8209a177f780e3891c8f55b3ba97801ebaa3613da318881413b9c9428f359881"} Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.985803 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerID="75822ef3154de0aba12eb764accc619ff70ad5870583ee58753a92f4e7648898" exitCode=0 Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.985847 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerID="1fabad7a7dc9d9d6c217458ff6c6d9d451e0d3bf0b14f2b952653e46833449bd" exitCode=2 Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.985914 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerDied","Data":"75822ef3154de0aba12eb764accc619ff70ad5870583ee58753a92f4e7648898"} Oct 02 11:48:06 crc kubenswrapper[4835]: I1002 11:48:06.985947 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerDied","Data":"1fabad7a7dc9d9d6c217458ff6c6d9d451e0d3bf0b14f2b952653e46833449bd"} Oct 02 11:48:07 crc kubenswrapper[4835]: I1002 11:48:07.869855 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:48:07 crc kubenswrapper[4835]: I1002 11:48:07.999104 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"80b69826-9737-464b-a39c-1b853ed917db","Type":"ContainerStarted","Data":"f9ef6fd4f173d61f0c10f672f064391178c39ae5c6e6854797f7af60278c59f7"} Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.000247 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.005783 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerID="0c5061c706e1924d68bd435577b3a879924ba6a43d453f8054d879c4d2769b1c" exitCode=0 Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.005827 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerDied","Data":"0c5061c706e1924d68bd435577b3a879924ba6a43d453f8054d879c4d2769b1c"} Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.019538 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-6d655558cb-84687" Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.028274 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.028211485 podStartE2EDuration="3.028211485s" podCreationTimestamp="2025-10-02 11:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:48:08.024194279 +0000 UTC m=+3164.584101890" watchObservedRunningTime="2025-10-02 11:48:08.028211485 +0000 UTC m=+3164.588119066" Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.089108 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74c74d79b4-rfzlk"] Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.089546 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74c74d79b4-rfzlk" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon-log" containerID="cri-o://150a942f6a8baada3743cbe30652d6c9181e7f38db63213ad3b8f85e914cae23" gracePeriod=30 Oct 02 11:48:08 crc kubenswrapper[4835]: I1002 11:48:08.089704 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74c74d79b4-rfzlk" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" containerID="cri-o://3fa84eb5024fd4bca93b4d7c1edf20279eb1b53420ac17b2a4faecad1a8d2b02" gracePeriod=30 Oct 02 11:48:10 crc kubenswrapper[4835]: I1002 11:48:10.382595 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 02 11:48:10 crc kubenswrapper[4835]: I1002 11:48:10.819528 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-lqghn" Oct 02 11:48:10 crc kubenswrapper[4835]: I1002 11:48:10.914423 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-dsbm5"] Oct 02 11:48:10 crc kubenswrapper[4835]: I1002 11:48:10.915065 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" podUID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerName="dnsmasq-dns" containerID="cri-o://5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5" gracePeriod=10 Oct 02 11:48:11 crc kubenswrapper[4835]: I1002 11:48:11.034651 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerID="c5f41bf19843de65c7058a557f1bb07691dfa9cbab75c48bfb2e354fa6c428d4" exitCode=0 Oct 02 11:48:11 crc kubenswrapper[4835]: I1002 11:48:11.034700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerDied","Data":"c5f41bf19843de65c7058a557f1bb07691dfa9cbab75c48bfb2e354fa6c428d4"} Oct 02 11:48:11 crc kubenswrapper[4835]: I1002 11:48:11.909930 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.046178 4835 generic.go:334] "Generic (PLEG): container finished" podID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerID="3fa84eb5024fd4bca93b4d7c1edf20279eb1b53420ac17b2a4faecad1a8d2b02" exitCode=0 Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.046259 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74c74d79b4-rfzlk" event={"ID":"37beff21-75a0-4297-a84b-9a34ccb1d2e0","Type":"ContainerDied","Data":"3fa84eb5024fd4bca93b4d7c1edf20279eb1b53420ac17b2a4faecad1a8d2b02"} Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.049202 4835 generic.go:334] "Generic (PLEG): container finished" podID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerID="5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5" exitCode=0 Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.049268 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" event={"ID":"b8894eb8-5eee-48da-81e5-1a98616c3a1f","Type":"ContainerDied","Data":"5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5"} Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.049311 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.049334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-dsbm5" event={"ID":"b8894eb8-5eee-48da-81e5-1a98616c3a1f","Type":"ContainerDied","Data":"866622bce53fe65b2312ba2645020fe7dc422b1fa0a1350fd81890df1861e4b3"} Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.049356 4835 scope.go:117] "RemoveContainer" containerID="5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.069582 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-dns-svc\") pod \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.069629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-nb\") pod \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.069740 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-sb\") pod \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.069766 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qtgh\" (UniqueName: \"kubernetes.io/projected/b8894eb8-5eee-48da-81e5-1a98616c3a1f-kube-api-access-9qtgh\") pod \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.069801 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-config\") pod 
\"b8894eb8-5eee-48da-81e5-1a98616c3a1f\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.069823 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-openstack-edpm-ipam\") pod \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\" (UID: \"b8894eb8-5eee-48da-81e5-1a98616c3a1f\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.073855 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8894eb8-5eee-48da-81e5-1a98616c3a1f-kube-api-access-9qtgh" (OuterVolumeSpecName: "kube-api-access-9qtgh") pod "b8894eb8-5eee-48da-81e5-1a98616c3a1f" (UID: "b8894eb8-5eee-48da-81e5-1a98616c3a1f"). InnerVolumeSpecName "kube-api-access-9qtgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.077753 4835 scope.go:117] "RemoveContainer" containerID="2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.098436 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.129034 4835 scope.go:117] "RemoveContainer" containerID="5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.139780 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-config" (OuterVolumeSpecName: "config") pod "b8894eb8-5eee-48da-81e5-1a98616c3a1f" (UID: "b8894eb8-5eee-48da-81e5-1a98616c3a1f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: E1002 11:48:12.144306 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5\": container with ID starting with 5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5 not found: ID does not exist" containerID="5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.144360 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5"} err="failed to get container status \"5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5\": rpc error: code = NotFound desc = could not find container \"5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5\": container with ID starting with 5cdbcd3e4eae63876e024af166d5ea898f4884e0ed57e942c221ea1a30c855a5 not found: ID does not exist" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.144386 4835 scope.go:117] "RemoveContainer" containerID="2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab" Oct 02 11:48:12 crc kubenswrapper[4835]: E1002 11:48:12.145492 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab\": container with ID starting with 2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab not found: ID does not exist" containerID="2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.145544 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab"} err="failed to get container status \"2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab\": rpc error: code = NotFound desc = could not find container \"2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab\": container with ID starting with 2456a3fc2369098823c87164d5c8a112e3c27881eee3331056b3a0d6572025ab not found: ID does not exist" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.171992 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qtgh\" (UniqueName: \"kubernetes.io/projected/b8894eb8-5eee-48da-81e5-1a98616c3a1f-kube-api-access-9qtgh\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.172019 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.175451 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b8894eb8-5eee-48da-81e5-1a98616c3a1f" (UID: "b8894eb8-5eee-48da-81e5-1a98616c3a1f"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.195358 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8894eb8-5eee-48da-81e5-1a98616c3a1f" (UID: "b8894eb8-5eee-48da-81e5-1a98616c3a1f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.196802 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8894eb8-5eee-48da-81e5-1a98616c3a1f" (UID: "b8894eb8-5eee-48da-81e5-1a98616c3a1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.212195 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b8894eb8-5eee-48da-81e5-1a98616c3a1f" (UID: "b8894eb8-5eee-48da-81e5-1a98616c3a1f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.272910 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-log-httpd\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.272977 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-run-httpd\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273010 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-ceilometer-tls-certs\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273069 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv5qf\" (UniqueName: \"kubernetes.io/projected/e9bd12da-7335-4fac-84d4-36e3f674a435-kube-api-access-kv5qf\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273154 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-config-data\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273236 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-scripts\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273264 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-combined-ca-bundle\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273356 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-sg-core-conf-yaml\") pod \"e9bd12da-7335-4fac-84d4-36e3f674a435\" (UID: \"e9bd12da-7335-4fac-84d4-36e3f674a435\") " Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273865 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273889 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273903 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273915 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8894eb8-5eee-48da-81e5-1a98616c3a1f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.273992 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.274276 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.278385 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-scripts" (OuterVolumeSpecName: "scripts") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.278723 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bd12da-7335-4fac-84d4-36e3f674a435-kube-api-access-kv5qf" (OuterVolumeSpecName: "kube-api-access-kv5qf") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "kube-api-access-kv5qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.308254 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.341594 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.365341 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.379511 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.379545 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.379556 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.379564 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.379572 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9bd12da-7335-4fac-84d4-36e3f674a435-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.379580 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.379588 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv5qf\" (UniqueName: \"kubernetes.io/projected/e9bd12da-7335-4fac-84d4-36e3f674a435-kube-api-access-kv5qf\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.380409 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-dsbm5"] Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.388188 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-864d5fc68c-dsbm5"] Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.423983 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-config-data" (OuterVolumeSpecName: "config-data") pod "e9bd12da-7335-4fac-84d4-36e3f674a435" (UID: "e9bd12da-7335-4fac-84d4-36e3f674a435"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:12 crc kubenswrapper[4835]: I1002 11:48:12.481901 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bd12da-7335-4fac-84d4-36e3f674a435-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.060185 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9bd12da-7335-4fac-84d4-36e3f674a435","Type":"ContainerDied","Data":"b3c6629dde6c9f7c2f8657e6a652fe3a96a91efa411f74760093e9d8106b584f"} Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.060780 4835 scope.go:117] "RemoveContainer" containerID="75822ef3154de0aba12eb764accc619ff70ad5870583ee58753a92f4e7648898" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.060239 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.061734 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1730f4f8-033a-43b9-8081-22825c549cc0","Type":"ContainerStarted","Data":"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991"} Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.061777 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1730f4f8-033a-43b9-8081-22825c549cc0","Type":"ContainerStarted","Data":"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255"} Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.096517 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.668019054 podStartE2EDuration="13.096500856s" podCreationTimestamp="2025-10-02 11:48:00 +0000 UTC" firstStartedPulling="2025-10-02 11:48:01.263584205 +0000 UTC m=+3157.823491786" lastFinishedPulling="2025-10-02 11:48:11.692065997 +0000 UTC m=+3168.251973588" observedRunningTime="2025-10-02 11:48:13.089081362 +0000 UTC m=+3169.648988943" watchObservedRunningTime="2025-10-02 11:48:13.096500856 +0000 UTC m=+3169.656408437" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.114202 4835 scope.go:117] "RemoveContainer" containerID="1fabad7a7dc9d9d6c217458ff6c6d9d451e0d3bf0b14f2b952653e46833449bd" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.115553 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.131512 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.154378 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:13 crc kubenswrapper[4835]: E1002 11:48:13.154796 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="proxy-httpd" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.154814 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="proxy-httpd" Oct 02 11:48:13 crc kubenswrapper[4835]: E1002 11:48:13.154829 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerName="init" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.154838 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerName="init" Oct 02 11:48:13 crc kubenswrapper[4835]: E1002 11:48:13.154847 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerName="dnsmasq-dns" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.154854 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerName="dnsmasq-dns" Oct 02 11:48:13 crc kubenswrapper[4835]: E1002 11:48:13.154870 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-notification-agent" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.154876 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-notification-agent" Oct 02 11:48:13 crc kubenswrapper[4835]: E1002 11:48:13.154893 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="sg-core" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.154899 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="sg-core" Oct 02 11:48:13 crc kubenswrapper[4835]: E1002 11:48:13.154944 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-central-agent" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.154950 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-central-agent" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.155145 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-central-agent" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.155160 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="sg-core" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.155171 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="proxy-httpd" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.155189 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" containerName="dnsmasq-dns" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.155200 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" containerName="ceilometer-notification-agent" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.156800 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.159704 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.164276 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.165948 4835 scope.go:117] "RemoveContainer" containerID="c5f41bf19843de65c7058a557f1bb07691dfa9cbab75c48bfb2e354fa6c428d4" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.166214 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.166447 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.195737 4835 scope.go:117] "RemoveContainer" containerID="0c5061c706e1924d68bd435577b3a879924ba6a43d453f8054d879c4d2769b1c" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.239077 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74c74d79b4-rfzlk" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.248058 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:13 crc kubenswrapper[4835]: E1002 11:48:13.248931 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-q6vfj log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="c2c1dd51-5647-44ee-8d4f-3dec04c5849c" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.296406 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-config-data\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.296469 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.296555 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.296595 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-run-httpd\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 
11:48:13.296633 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.296672 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vfj\" (UniqueName: \"kubernetes.io/projected/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-kube-api-access-q6vfj\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.296730 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-scripts\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.296934 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-log-httpd\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.398936 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vfj\" (UniqueName: \"kubernetes.io/projected/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-kube-api-access-q6vfj\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-scripts\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-log-httpd\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-config-data\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399235 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399285 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " 
pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-run-httpd\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.399987 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-log-httpd\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.400641 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-run-httpd\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.404607 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-config-data\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.404640 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.404744 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.405598 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.407935 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-scripts\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:13 crc kubenswrapper[4835]: I1002 11:48:13.417062 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vfj\" (UniqueName: \"kubernetes.io/projected/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-kube-api-access-q6vfj\") pod \"ceilometer-0\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " pod="openstack/ceilometer-0" Oct 02 11:48:14 crc 
kubenswrapper[4835]: I1002 11:48:14.072772 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.086673 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219086 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-combined-ca-bundle\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219168 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-run-httpd\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219195 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-ceilometer-tls-certs\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219295 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-sg-core-conf-yaml\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-config-data\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219509 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-log-httpd\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219538 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6vfj\" (UniqueName: \"kubernetes.io/projected/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-kube-api-access-q6vfj\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219566 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-scripts\") pod \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\" (UID: \"c2c1dd51-5647-44ee-8d4f-3dec04c5849c\") " Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219610 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.219846 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.220328 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.220428 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.226167 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-config-data" (OuterVolumeSpecName: "config-data") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.226191 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-scripts" (OuterVolumeSpecName: "scripts") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.226365 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-kube-api-access-q6vfj" (OuterVolumeSpecName: "kube-api-access-q6vfj") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "kube-api-access-q6vfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.226868 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.227767 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.228683 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c2c1dd51-5647-44ee-8d4f-3dec04c5849c" (UID: "c2c1dd51-5647-44ee-8d4f-3dec04c5849c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.263085 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8894eb8-5eee-48da-81e5-1a98616c3a1f" path="/var/lib/kubelet/pods/b8894eb8-5eee-48da-81e5-1a98616c3a1f/volumes" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.263726 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9bd12da-7335-4fac-84d4-36e3f674a435" path="/var/lib/kubelet/pods/e9bd12da-7335-4fac-84d4-36e3f674a435/volumes" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.322658 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.322708 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6vfj\" (UniqueName: \"kubernetes.io/projected/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-kube-api-access-q6vfj\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.322723 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.322734 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.322747 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:14 crc kubenswrapper[4835]: I1002 11:48:14.322758 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c1dd51-5647-44ee-8d4f-3dec04c5849c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.082509 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.132637 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.137758 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.152641 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.155050 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.163770 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.163899 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.164203 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.189361 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec2a0f3a-56b8-4866-9740-d6499077797a-log-httpd\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339543 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-config-data\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339687 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-scripts\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339704 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec2a0f3a-56b8-4866-9740-d6499077797a-run-httpd\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339748 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vfjb\" (UniqueName: \"kubernetes.io/projected/ec2a0f3a-56b8-4866-9740-d6499077797a-kube-api-access-6vfjb\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339786 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.339809 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441309 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec2a0f3a-56b8-4866-9740-d6499077797a-log-httpd\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441392 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-config-data\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441477 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441503 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-scripts\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441523 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec2a0f3a-56b8-4866-9740-d6499077797a-run-httpd\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441562 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vfjb\" (UniqueName: \"kubernetes.io/projected/ec2a0f3a-56b8-4866-9740-d6499077797a-kube-api-access-6vfjb\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441600 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.441622 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.443073 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec2a0f3a-56b8-4866-9740-d6499077797a-run-httpd\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.443177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ec2a0f3a-56b8-4866-9740-d6499077797a-log-httpd\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.446335 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-config-data\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.446593 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.446615 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.446990 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-scripts\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.447309 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec2a0f3a-56b8-4866-9740-d6499077797a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.466899 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vfjb\" (UniqueName: \"kubernetes.io/projected/ec2a0f3a-56b8-4866-9740-d6499077797a-kube-api-access-6vfjb\") pod \"ceilometer-0\" (UID: \"ec2a0f3a-56b8-4866-9740-d6499077797a\") " pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.482232 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:48:15 crc kubenswrapper[4835]: W1002 11:48:15.936469 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec2a0f3a_56b8_4866_9740_d6499077797a.slice/crio-66ff856b036fe7c2d33bb4d7ce3f122797ce5f512b050bd0cebad41ecc6b0790 WatchSource:0}: Error finding container 66ff856b036fe7c2d33bb4d7ce3f122797ce5f512b050bd0cebad41ecc6b0790: Status 404 returned error can't find the container with id 66ff856b036fe7c2d33bb4d7ce3f122797ce5f512b050bd0cebad41ecc6b0790 Oct 02 11:48:15 crc kubenswrapper[4835]: I1002 11:48:15.941836 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:48:16 crc kubenswrapper[4835]: I1002 11:48:16.092897 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec2a0f3a-56b8-4866-9740-d6499077797a","Type":"ContainerStarted","Data":"66ff856b036fe7c2d33bb4d7ce3f122797ce5f512b050bd0cebad41ecc6b0790"} Oct 02 11:48:16 crc kubenswrapper[4835]: I1002 11:48:16.265551 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c1dd51-5647-44ee-8d4f-3dec04c5849c" path="/var/lib/kubelet/pods/c2c1dd51-5647-44ee-8d4f-3dec04c5849c/volumes" Oct 02 11:48:17 crc kubenswrapper[4835]: I1002 11:48:17.104214 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec2a0f3a-56b8-4866-9740-d6499077797a","Type":"ContainerStarted","Data":"fdc01b5a231bc9224706ac432b478187e2805b2a21ba9b0380ff7945e2f56a06"} Oct 02 11:48:17 crc kubenswrapper[4835]: I1002 11:48:17.251995 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:48:17 crc kubenswrapper[4835]: E1002 11:48:17.252784 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:48:18 crc kubenswrapper[4835]: I1002 11:48:18.117870 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec2a0f3a-56b8-4866-9740-d6499077797a","Type":"ContainerStarted","Data":"beec638b5dd64b1e5fbc5c2debdab00f6743eb620c38c47b2d1cda4cea6b2e16"} Oct 02 11:48:19 crc kubenswrapper[4835]: I1002 11:48:19.130155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec2a0f3a-56b8-4866-9740-d6499077797a","Type":"ContainerStarted","Data":"f6765a61b39529ec5568fba98b38afe56ed6188fc6c88477f4b6e57df5fb70f5"} Oct 02 11:48:20 crc kubenswrapper[4835]: I1002 11:48:20.352636 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 02 11:48:21 crc kubenswrapper[4835]: I1002 11:48:21.152081 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec2a0f3a-56b8-4866-9740-d6499077797a","Type":"ContainerStarted","Data":"d1d090974b5e3d4b2afda85df5c759b376f7963b013e4fa9e32a378d9c3f7fe3"} Oct 02 11:48:21 crc kubenswrapper[4835]: I1002 11:48:21.152697 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:48:21 crc kubenswrapper[4835]: I1002 
11:48:21.179083 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4810578300000001 podStartE2EDuration="6.179048605s" podCreationTimestamp="2025-10-02 11:48:15 +0000 UTC" firstStartedPulling="2025-10-02 11:48:15.9393569 +0000 UTC m=+3172.499264481" lastFinishedPulling="2025-10-02 11:48:20.637347675 +0000 UTC m=+3177.197255256" observedRunningTime="2025-10-02 11:48:21.173011901 +0000 UTC m=+3177.732919502" watchObservedRunningTime="2025-10-02 11:48:21.179048605 +0000 UTC m=+3177.738956186" Oct 02 11:48:22 crc kubenswrapper[4835]: I1002 11:48:22.034388 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 02 11:48:22 crc kubenswrapper[4835]: I1002 11:48:22.074763 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:22 crc kubenswrapper[4835]: I1002 11:48:22.160130 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="manila-scheduler" containerID="cri-o://3e6b25fdc23ea30691cbc0bb2c49987854cbeb6824979fa95656432e01d32b06" gracePeriod=30 Oct 02 11:48:22 crc kubenswrapper[4835]: I1002 11:48:22.160231 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="probe" containerID="cri-o://ab6fff716d7a0f23dcabc2acd1c93bf49365fcc2e306559382d8c5f929a29c18" gracePeriod=30 Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.181783 4835 generic.go:334] "Generic (PLEG): container finished" podID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerID="ab6fff716d7a0f23dcabc2acd1c93bf49365fcc2e306559382d8c5f929a29c18" exitCode=0 Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.182046 4835 generic.go:334] "Generic (PLEG): container finished" podID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerID="3e6b25fdc23ea30691cbc0bb2c49987854cbeb6824979fa95656432e01d32b06" exitCode=0 Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.181962 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04247211-0a2a-4cf7-989f-87a89e84bca6","Type":"ContainerDied","Data":"ab6fff716d7a0f23dcabc2acd1c93bf49365fcc2e306559382d8c5f929a29c18"} Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.182087 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04247211-0a2a-4cf7-989f-87a89e84bca6","Type":"ContainerDied","Data":"3e6b25fdc23ea30691cbc0bb2c49987854cbeb6824979fa95656432e01d32b06"} Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.239539 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74c74d79b4-rfzlk" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.454633 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.614208 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjmd2\" (UniqueName: \"kubernetes.io/projected/04247211-0a2a-4cf7-989f-87a89e84bca6-kube-api-access-wjmd2\") pod \"04247211-0a2a-4cf7-989f-87a89e84bca6\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.614352 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data\") pod \"04247211-0a2a-4cf7-989f-87a89e84bca6\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.614586 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-scripts\") pod \"04247211-0a2a-4cf7-989f-87a89e84bca6\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.614756 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04247211-0a2a-4cf7-989f-87a89e84bca6-etc-machine-id\") pod \"04247211-0a2a-4cf7-989f-87a89e84bca6\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.614856 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data-custom\") pod \"04247211-0a2a-4cf7-989f-87a89e84bca6\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.614915 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04247211-0a2a-4cf7-989f-87a89e84bca6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "04247211-0a2a-4cf7-989f-87a89e84bca6" (UID: "04247211-0a2a-4cf7-989f-87a89e84bca6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.614931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-combined-ca-bundle\") pod \"04247211-0a2a-4cf7-989f-87a89e84bca6\" (UID: \"04247211-0a2a-4cf7-989f-87a89e84bca6\") " Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.615957 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04247211-0a2a-4cf7-989f-87a89e84bca6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.620912 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-scripts" (OuterVolumeSpecName: "scripts") pod "04247211-0a2a-4cf7-989f-87a89e84bca6" (UID: "04247211-0a2a-4cf7-989f-87a89e84bca6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.623407 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04247211-0a2a-4cf7-989f-87a89e84bca6-kube-api-access-wjmd2" (OuterVolumeSpecName: "kube-api-access-wjmd2") pod "04247211-0a2a-4cf7-989f-87a89e84bca6" (UID: "04247211-0a2a-4cf7-989f-87a89e84bca6"). InnerVolumeSpecName "kube-api-access-wjmd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.626562 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "04247211-0a2a-4cf7-989f-87a89e84bca6" (UID: "04247211-0a2a-4cf7-989f-87a89e84bca6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.686264 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04247211-0a2a-4cf7-989f-87a89e84bca6" (UID: "04247211-0a2a-4cf7-989f-87a89e84bca6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.717560 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.717590 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.717600 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjmd2\" (UniqueName: \"kubernetes.io/projected/04247211-0a2a-4cf7-989f-87a89e84bca6-kube-api-access-wjmd2\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.717610 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.733654 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data" (OuterVolumeSpecName: "config-data") pod "04247211-0a2a-4cf7-989f-87a89e84bca6" (UID: "04247211-0a2a-4cf7-989f-87a89e84bca6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:23 crc kubenswrapper[4835]: I1002 11:48:23.819816 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04247211-0a2a-4cf7-989f-87a89e84bca6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.194589 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04247211-0a2a-4cf7-989f-87a89e84bca6","Type":"ContainerDied","Data":"33c3d098bf92aebdde60146fee051412e4991a2b6302bf546c7981101eb337b9"} Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.194665 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.194967 4835 scope.go:117] "RemoveContainer" containerID="ab6fff716d7a0f23dcabc2acd1c93bf49365fcc2e306559382d8c5f929a29c18" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.232403 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.233418 4835 scope.go:117] "RemoveContainer" containerID="3e6b25fdc23ea30691cbc0bb2c49987854cbeb6824979fa95656432e01d32b06" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.243554 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.280722 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" path="/var/lib/kubelet/pods/04247211-0a2a-4cf7-989f-87a89e84bca6/volumes" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.290388 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:24 crc kubenswrapper[4835]: E1002 11:48:24.290948 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="manila-scheduler" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.290971 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="manila-scheduler" Oct 02 11:48:24 crc kubenswrapper[4835]: E1002 11:48:24.291014 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="probe" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.291024 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="probe" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.291330 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="manila-scheduler" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.291392 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="04247211-0a2a-4cf7-989f-87a89e84bca6" containerName="probe" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.292818 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.295087 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.300086 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.431146 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.431510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-scripts\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.431633 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36fa1fd3-bc15-4b6a-912f-91bc6942c407-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.431734 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk4cm\" (UniqueName: \"kubernetes.io/projected/36fa1fd3-bc15-4b6a-912f-91bc6942c407-kube-api-access-sk4cm\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.431854 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.431990 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-config-data\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.533529 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-config-data\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.533632 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.533662 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-scripts\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.533683 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36fa1fd3-bc15-4b6a-912f-91bc6942c407-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.533714 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk4cm\" (UniqueName: \"kubernetes.io/projected/36fa1fd3-bc15-4b6a-912f-91bc6942c407-kube-api-access-sk4cm\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.533755 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.539866 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-scripts\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.546481 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36fa1fd3-bc15-4b6a-912f-91bc6942c407-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.550370 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.553006 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk4cm\" (UniqueName: \"kubernetes.io/projected/36fa1fd3-bc15-4b6a-912f-91bc6942c407-kube-api-access-sk4cm\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.564105 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-config-data\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.564202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36fa1fd3-bc15-4b6a-912f-91bc6942c407-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"36fa1fd3-bc15-4b6a-912f-91bc6942c407\") " pod="openstack/manila-scheduler-0" Oct 02 
11:48:24 crc kubenswrapper[4835]: I1002 11:48:24.612385 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 11:48:25 crc kubenswrapper[4835]: I1002 11:48:25.047595 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 11:48:25 crc kubenswrapper[4835]: I1002 11:48:25.210151 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"36fa1fd3-bc15-4b6a-912f-91bc6942c407","Type":"ContainerStarted","Data":"f20eb10035165a85d66434663affe5dc2c8e5e6be745b794fb739d591017fc36"} Oct 02 11:48:26 crc kubenswrapper[4835]: I1002 11:48:26.225355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"36fa1fd3-bc15-4b6a-912f-91bc6942c407","Type":"ContainerStarted","Data":"1ef18abc8a928398159e5e8b252d903d9dd1b4db679418fd847b9f0f6c7fca44"} Oct 02 11:48:26 crc kubenswrapper[4835]: I1002 11:48:26.225807 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"36fa1fd3-bc15-4b6a-912f-91bc6942c407","Type":"ContainerStarted","Data":"0336ea3279af05e082b9b7a128cedc523ffcb0b2c2a7e06e2cd71e8aed53f49a"} Oct 02 11:48:26 crc kubenswrapper[4835]: I1002 11:48:26.256971 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.256949672 podStartE2EDuration="2.256949672s" podCreationTimestamp="2025-10-02 11:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:48:26.246648925 +0000 UTC m=+3182.806556536" watchObservedRunningTime="2025-10-02 11:48:26.256949672 +0000 UTC m=+3182.816857273" Oct 02 11:48:27 crc kubenswrapper[4835]: I1002 11:48:27.091446 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 02 11:48:31 crc kubenswrapper[4835]: I1002 11:48:31.253286 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:48:31 crc kubenswrapper[4835]: E1002 11:48:31.254179 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:48:31 crc kubenswrapper[4835]: I1002 11:48:31.935471 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 02 11:48:31 crc kubenswrapper[4835]: I1002 11:48:31.990063 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:32 crc kubenswrapper[4835]: I1002 11:48:32.288939 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" containerName="manila-share" containerID="cri-o://24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255" gracePeriod=30 Oct 02 11:48:32 crc kubenswrapper[4835]: I1002 11:48:32.288962 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" 
containerName="probe" containerID="cri-o://a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991" gracePeriod=30 Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.157872 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.239562 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74c74d79b4-rfzlk" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.239817 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.308508 4835 generic.go:334] "Generic (PLEG): container finished" podID="1730f4f8-033a-43b9-8081-22825c549cc0" containerID="a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991" exitCode=0 Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.308553 4835 generic.go:334] "Generic (PLEG): container finished" podID="1730f4f8-033a-43b9-8081-22825c549cc0" containerID="24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255" exitCode=1 Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.308584 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1730f4f8-033a-43b9-8081-22825c549cc0","Type":"ContainerDied","Data":"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991"} Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.308596 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.308625 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1730f4f8-033a-43b9-8081-22825c549cc0","Type":"ContainerDied","Data":"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255"} Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.308636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1730f4f8-033a-43b9-8081-22825c549cc0","Type":"ContainerDied","Data":"39da1b6a2063d520a7d0b0edc3736afc8d911cc88b123ac1b42ecb435d03f8ec"} Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.308655 4835 scope.go:117] "RemoveContainer" containerID="a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.329949 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.330063 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-ceph\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.330120 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-scripts\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.330151 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data-custom\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.330197 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-etc-machine-id\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.330269 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g77g\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-kube-api-access-8g77g\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.330408 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-var-lib-manila\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.330465 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-combined-ca-bundle\") pod \"1730f4f8-033a-43b9-8081-22825c549cc0\" (UID: \"1730f4f8-033a-43b9-8081-22825c549cc0\") " Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.331766 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.334415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.341815 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-scripts" (OuterVolumeSpecName: "scripts") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.341881 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-ceph" (OuterVolumeSpecName: "ceph") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.341941 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-kube-api-access-8g77g" (OuterVolumeSpecName: "kube-api-access-8g77g") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "kube-api-access-8g77g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.344495 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.344786 4835 scope.go:117] "RemoveContainer" containerID="24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.404103 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.432788 4835 scope.go:117] "RemoveContainer" containerID="a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.434456 4835 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.434678 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.434760 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.434851 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.434925 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g77g\" (UniqueName: \"kubernetes.io/projected/1730f4f8-033a-43b9-8081-22825c549cc0-kube-api-access-8g77g\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.434998 4835 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1730f4f8-033a-43b9-8081-22825c549cc0-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.435064 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: E1002 11:48:33.436280 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991\": container with ID starting with a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991 not found: ID does not exist" containerID="a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.436353 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991"} err="failed to get container status \"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991\": rpc error: code = NotFound desc = could not find container \"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991\": container with ID starting with a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991 not found: ID does not exist" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.436388 4835 scope.go:117] "RemoveContainer" containerID="24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255" Oct 02 11:48:33 crc kubenswrapper[4835]: E1002 11:48:33.436698 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255\": container with ID starting with 24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255 not found: ID does not exist" containerID="24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.436727 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255"} err="failed to get container status \"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255\": rpc error: code = NotFound desc = could not find container \"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255\": container with ID starting with 24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255 not found: ID does not exist" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.436744 4835 scope.go:117] "RemoveContainer" containerID="a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.437027 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991"} err="failed to get container status \"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991\": rpc error: code = NotFound desc = could not find container \"a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991\": container with ID starting with a85ab0d2e3126ad4630ad8d24ddcc23880832aab164d7aed6b56a8b3a0b74991 not found: ID does not exist" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.437049 4835 scope.go:117] "RemoveContainer" containerID="24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.437641 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255"} err="failed to get container status \"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255\": rpc error: code = NotFound desc = could not find container \"24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255\": container with ID starting with 24223ef33dba7439b1953ab42bf217bb74dbc4c1450b3c9a3cbd97a55d8ab255 not found: ID does not exist" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.449879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data" (OuterVolumeSpecName: "config-data") pod "1730f4f8-033a-43b9-8081-22825c549cc0" (UID: "1730f4f8-033a-43b9-8081-22825c549cc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.537551 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730f4f8-033a-43b9-8081-22825c549cc0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.683931 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.696975 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.718459 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:33 crc kubenswrapper[4835]: E1002 11:48:33.718842 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" containerName="probe" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.718859 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" containerName="probe" Oct 02 11:48:33 crc kubenswrapper[4835]: E1002 11:48:33.718885 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" containerName="manila-share" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.718891 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" containerName="manila-share" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.731684 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" containerName="manila-share" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.731728 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" containerName="probe" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.734720 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.734869 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.738261 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.847646 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.847685 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-config-data\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.847708 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.847820 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kw4v\" (UniqueName: \"kubernetes.io/projected/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-kube-api-access-9kw4v\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.847889 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.847908 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-ceph\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.848057 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-scripts\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.848160 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.950982 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kw4v\" (UniqueName: 
\"kubernetes.io/projected/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-kube-api-access-9kw4v\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951196 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951273 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-ceph\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951337 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-scripts\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951404 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951476 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-config-data\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951569 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.951765 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.952733 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc 
kubenswrapper[4835]: I1002 11:48:33.955616 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-ceph\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.956133 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.956204 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-config-data\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.957032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-scripts\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.968482 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:33 crc kubenswrapper[4835]: I1002 11:48:33.980574 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kw4v\" (UniqueName: \"kubernetes.io/projected/8cd67eac-f445-4d9e-b0ef-26de2604c1bd-kube-api-access-9kw4v\") pod \"manila-share-share1-0\" (UID: \"8cd67eac-f445-4d9e-b0ef-26de2604c1bd\") " pod="openstack/manila-share-share1-0" Oct 02 11:48:34 crc kubenswrapper[4835]: I1002 11:48:34.060533 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 11:48:34 crc kubenswrapper[4835]: I1002 11:48:34.283179 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1730f4f8-033a-43b9-8081-22825c549cc0" path="/var/lib/kubelet/pods/1730f4f8-033a-43b9-8081-22825c549cc0/volumes" Oct 02 11:48:34 crc kubenswrapper[4835]: I1002 11:48:34.613388 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 02 11:48:34 crc kubenswrapper[4835]: I1002 11:48:34.659785 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 11:48:35 crc kubenswrapper[4835]: I1002 11:48:35.378971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8cd67eac-f445-4d9e-b0ef-26de2604c1bd","Type":"ContainerStarted","Data":"d2fa5e5c11b1bb6991b4a5be7eb6a910c6e7694fc8cb8cffbb25c062bfd5320e"} Oct 02 11:48:35 crc kubenswrapper[4835]: I1002 11:48:35.379824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8cd67eac-f445-4d9e-b0ef-26de2604c1bd","Type":"ContainerStarted","Data":"6e435870043ef7b86af1868b77cd14c056ceb048a9cb903efec00b310f7033bd"} Oct 02 11:48:36 crc kubenswrapper[4835]: I1002 11:48:36.405122 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8cd67eac-f445-4d9e-b0ef-26de2604c1bd","Type":"ContainerStarted","Data":"5d287b916d5801541879d9269e030fbf47b53670b89134a352665442c67586d1"} Oct 02 11:48:36 crc kubenswrapper[4835]: I1002 11:48:36.439789 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.439766914 podStartE2EDuration="3.439766914s" podCreationTimestamp="2025-10-02 11:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:48:36.432305329 +0000 UTC m=+3192.992212930" watchObservedRunningTime="2025-10-02 11:48:36.439766914 +0000 UTC m=+3192.999674495" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.422833 4835 generic.go:334] "Generic (PLEG): container finished" podID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerID="150a942f6a8baada3743cbe30652d6c9181e7f38db63213ad3b8f85e914cae23" exitCode=137 Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.422868 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74c74d79b4-rfzlk" event={"ID":"37beff21-75a0-4297-a84b-9a34ccb1d2e0","Type":"ContainerDied","Data":"150a942f6a8baada3743cbe30652d6c9181e7f38db63213ad3b8f85e914cae23"} Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.423471 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74c74d79b4-rfzlk" event={"ID":"37beff21-75a0-4297-a84b-9a34ccb1d2e0","Type":"ContainerDied","Data":"82e0441e39723c666697d56aa0ce7f5be7c4d4b9a8181f4de9d928ea063493af"} Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.423486 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82e0441e39723c666697d56aa0ce7f5be7c4d4b9a8181f4de9d928ea063493af" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.461729 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.556585 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-secret-key\") pod \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.556686 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-config-data\") pod \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.556704 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-tls-certs\") pod \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.556733 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37beff21-75a0-4297-a84b-9a34ccb1d2e0-logs\") pod \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.556818 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-scripts\") pod \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.556854 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-combined-ca-bundle\") pod \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.556886 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrxkx\" (UniqueName: \"kubernetes.io/projected/37beff21-75a0-4297-a84b-9a34ccb1d2e0-kube-api-access-zrxkx\") pod \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\" (UID: \"37beff21-75a0-4297-a84b-9a34ccb1d2e0\") " Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.557969 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37beff21-75a0-4297-a84b-9a34ccb1d2e0-logs" (OuterVolumeSpecName: "logs") pod "37beff21-75a0-4297-a84b-9a34ccb1d2e0" (UID: "37beff21-75a0-4297-a84b-9a34ccb1d2e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.574554 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "37beff21-75a0-4297-a84b-9a34ccb1d2e0" (UID: "37beff21-75a0-4297-a84b-9a34ccb1d2e0"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.574628 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37beff21-75a0-4297-a84b-9a34ccb1d2e0-kube-api-access-zrxkx" (OuterVolumeSpecName: "kube-api-access-zrxkx") pod "37beff21-75a0-4297-a84b-9a34ccb1d2e0" (UID: "37beff21-75a0-4297-a84b-9a34ccb1d2e0"). InnerVolumeSpecName "kube-api-access-zrxkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.581754 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-config-data" (OuterVolumeSpecName: "config-data") pod "37beff21-75a0-4297-a84b-9a34ccb1d2e0" (UID: "37beff21-75a0-4297-a84b-9a34ccb1d2e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.584594 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-scripts" (OuterVolumeSpecName: "scripts") pod "37beff21-75a0-4297-a84b-9a34ccb1d2e0" (UID: "37beff21-75a0-4297-a84b-9a34ccb1d2e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.591530 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37beff21-75a0-4297-a84b-9a34ccb1d2e0" (UID: "37beff21-75a0-4297-a84b-9a34ccb1d2e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.626560 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "37beff21-75a0-4297-a84b-9a34ccb1d2e0" (UID: "37beff21-75a0-4297-a84b-9a34ccb1d2e0"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.658472 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.658501 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.658511 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37beff21-75a0-4297-a84b-9a34ccb1d2e0-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.658519 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37beff21-75a0-4297-a84b-9a34ccb1d2e0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.658527 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.658538 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrxkx\" (UniqueName: \"kubernetes.io/projected/37beff21-75a0-4297-a84b-9a34ccb1d2e0-kube-api-access-zrxkx\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:38 crc kubenswrapper[4835]: I1002 11:48:38.658546 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/37beff21-75a0-4297-a84b-9a34ccb1d2e0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:48:39 crc kubenswrapper[4835]: I1002 11:48:39.432154 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74c74d79b4-rfzlk" Oct 02 11:48:39 crc kubenswrapper[4835]: I1002 11:48:39.473527 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74c74d79b4-rfzlk"] Oct 02 11:48:39 crc kubenswrapper[4835]: I1002 11:48:39.486637 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74c74d79b4-rfzlk"] Oct 02 11:48:40 crc kubenswrapper[4835]: I1002 11:48:40.263010 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" path="/var/lib/kubelet/pods/37beff21-75a0-4297-a84b-9a34ccb1d2e0/volumes" Oct 02 11:48:43 crc kubenswrapper[4835]: I1002 11:48:43.253159 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:48:43 crc kubenswrapper[4835]: E1002 11:48:43.253790 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:48:44 crc kubenswrapper[4835]: I1002 11:48:44.061414 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 02 11:48:45 crc kubenswrapper[4835]: I1002 11:48:45.517783 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:48:46 crc kubenswrapper[4835]: I1002 11:48:46.122855 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 02 11:48:55 crc kubenswrapper[4835]: I1002 11:48:55.689146 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 02 11:48:56 crc kubenswrapper[4835]: I1002 11:48:56.252520 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:48:56 crc kubenswrapper[4835]: E1002 11:48:56.253046 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:49:08 crc kubenswrapper[4835]: I1002 11:49:08.255858 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:49:08 crc kubenswrapper[4835]: E1002 11:49:08.256584 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:49:21 crc kubenswrapper[4835]: I1002 11:49:21.252469 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:49:21 crc 
kubenswrapper[4835]: E1002 11:49:21.254391 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:49:35 crc kubenswrapper[4835]: I1002 11:49:35.251749 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:49:35 crc kubenswrapper[4835]: E1002 11:49:35.252697 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.719851 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 11:49:45 crc kubenswrapper[4835]: E1002 11:49:45.720743 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon-log" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.720755 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon-log" Oct 02 11:49:45 crc kubenswrapper[4835]: E1002 11:49:45.720780 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.720788 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.720978 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.720990 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="37beff21-75a0-4297-a84b-9a34ccb1d2e0" containerName="horizon-log" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.730358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.730453 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.742935 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.743124 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.743363 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lrw8k" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.743611 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847050 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847127 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847286 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-config-data\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847322 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847348 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whzxd\" (UniqueName: \"kubernetes.io/projected/5922a15b-856f-45aa-aed9-d8787e4f470f-kube-api-access-whzxd\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " 
pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847404 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.847506 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.948862 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.948936 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.948967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.948994 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.949042 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.949098 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-config-data\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.949135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: 
I1002 11:49:45.949694 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.949728 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.949802 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whzxd\" (UniqueName: \"kubernetes.io/projected/5922a15b-856f-45aa-aed9-d8787e4f470f-kube-api-access-whzxd\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.949872 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.950316 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.951634 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.951636 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-config-data\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.955939 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.956646 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.966589 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.970587 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whzxd\" (UniqueName: \"kubernetes.io/projected/5922a15b-856f-45aa-aed9-d8787e4f470f-kube-api-access-whzxd\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:45 crc kubenswrapper[4835]: I1002 11:49:45.996564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " pod="openstack/tempest-tests-tempest" Oct 02 11:49:46 crc kubenswrapper[4835]: I1002 11:49:46.066173 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 11:49:46 crc kubenswrapper[4835]: I1002 11:49:46.503786 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 11:49:46 crc kubenswrapper[4835]: I1002 11:49:46.510772 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:49:47 crc kubenswrapper[4835]: I1002 11:49:47.112978 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5922a15b-856f-45aa-aed9-d8787e4f470f","Type":"ContainerStarted","Data":"a913d017a655c1ade9a45bbd6d633b49a76f302b7083b944c6e7c9e02eee428e"} Oct 02 11:49:49 crc kubenswrapper[4835]: I1002 11:49:49.252580 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:49:49 crc kubenswrapper[4835]: E1002 11:49:49.253190 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:50:04 crc kubenswrapper[4835]: I1002 11:50:04.258873 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:50:04 crc kubenswrapper[4835]: E1002 11:50:04.259632 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:50:14 crc kubenswrapper[4835]: E1002 11:50:14.189289 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 02 11:50:14 crc kubenswrapper[4835]: E1002 11:50:14.190005 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whzxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5922a15b-856f-45aa-aed9-d8787e4f470f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:50:14 crc kubenswrapper[4835]: E1002 11:50:14.192138 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="5922a15b-856f-45aa-aed9-d8787e4f470f" Oct 02 11:50:14 crc kubenswrapper[4835]: E1002 11:50:14.389451 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5922a15b-856f-45aa-aed9-d8787e4f470f" Oct 02 11:50:15 crc kubenswrapper[4835]: I1002 11:50:15.252963 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:50:15 crc kubenswrapper[4835]: E1002 11:50:15.253282 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:50:28 crc kubenswrapper[4835]: I1002 11:50:28.861936 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 02 11:50:29 crc kubenswrapper[4835]: I1002 11:50:29.252278 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:50:29 crc kubenswrapper[4835]: E1002 11:50:29.252702 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:50:30 crc kubenswrapper[4835]: I1002 11:50:30.535002 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5922a15b-856f-45aa-aed9-d8787e4f470f","Type":"ContainerStarted","Data":"9c236c621536713ca9f903b52acc230578e65ecedfc67236d5d73141893a30e6"} Oct 02 11:50:30 crc kubenswrapper[4835]: I1002 11:50:30.557601 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.208620606 podStartE2EDuration="46.557580512s" podCreationTimestamp="2025-10-02 11:49:44 +0000 UTC" firstStartedPulling="2025-10-02 11:49:46.510534177 +0000 UTC m=+3263.070441758" lastFinishedPulling="2025-10-02 11:50:28.859494083 +0000 UTC m=+3305.419401664" observedRunningTime="2025-10-02 11:50:30.552762763 +0000 UTC m=+3307.112670354" watchObservedRunningTime="2025-10-02 11:50:30.557580512 +0000 UTC m=+3307.117488093" Oct 02 11:50:44 crc kubenswrapper[4835]: I1002 11:50:44.258866 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:50:44 crc kubenswrapper[4835]: E1002 11:50:44.259926 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" 
podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:50:59 crc kubenswrapper[4835]: I1002 11:50:59.252469 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:50:59 crc kubenswrapper[4835]: E1002 11:50:59.253058 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:51:12 crc kubenswrapper[4835]: I1002 11:51:12.251688 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:51:12 crc kubenswrapper[4835]: E1002 11:51:12.252368 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:51:23 crc kubenswrapper[4835]: I1002 11:51:23.252190 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:51:23 crc kubenswrapper[4835]: E1002 11:51:23.253396 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:51:34 crc kubenswrapper[4835]: I1002 11:51:34.260661 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:51:34 crc kubenswrapper[4835]: E1002 11:51:34.261768 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:51:47 crc kubenswrapper[4835]: I1002 11:51:47.252374 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:51:47 crc kubenswrapper[4835]: E1002 11:51:47.253838 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:51:59 crc kubenswrapper[4835]: I1002 11:51:59.252158 4835 scope.go:117] "RemoveContainer" 
containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:51:59 crc kubenswrapper[4835]: E1002 11:51:59.252841 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:52:14 crc kubenswrapper[4835]: I1002 11:52:14.258180 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:52:14 crc kubenswrapper[4835]: E1002 11:52:14.258998 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:52:25 crc kubenswrapper[4835]: I1002 11:52:25.252408 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:52:25 crc kubenswrapper[4835]: E1002 11:52:25.253215 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:52:40 crc kubenswrapper[4835]: I1002 11:52:40.263775 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:52:40 crc kubenswrapper[4835]: E1002 11:52:40.264727 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:52:51 crc kubenswrapper[4835]: I1002 11:52:51.252610 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:52:51 crc kubenswrapper[4835]: I1002 11:52:51.821812 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"226253d9d759d74973616d813e5b5469b61bc70d02466d6f60a4aa36a28fe914"} Oct 02 11:54:00 crc kubenswrapper[4835]: I1002 11:54:00.472937 4835 scope.go:117] "RemoveContainer" containerID="3fa84eb5024fd4bca93b4d7c1edf20279eb1b53420ac17b2a4faecad1a8d2b02" Oct 02 11:54:00 crc kubenswrapper[4835]: I1002 11:54:00.669354 4835 scope.go:117] "RemoveContainer" containerID="150a942f6a8baada3743cbe30652d6c9181e7f38db63213ad3b8f85e914cae23" Oct 02 11:55:11 crc kubenswrapper[4835]: I1002 11:55:11.984009 4835 patch_prober.go:28] 
interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:55:11 crc kubenswrapper[4835]: I1002 11:55:11.984527 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.346039 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bgrmd"] Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.348745 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.374965 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgrmd"] Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.468484 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-catalog-content\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.468687 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-utilities\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.468816 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6c6t\" (UniqueName: \"kubernetes.io/projected/ca8cc213-8ef4-4e95-b490-3b614b44e95c-kube-api-access-j6c6t\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.570949 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6c6t\" (UniqueName: \"kubernetes.io/projected/ca8cc213-8ef4-4e95-b490-3b614b44e95c-kube-api-access-j6c6t\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.571176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-catalog-content\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.571301 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-utilities\") pod \"certified-operators-bgrmd\" (UID: 
\"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.571822 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-catalog-content\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.571897 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-utilities\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.598371 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6c6t\" (UniqueName: \"kubernetes.io/projected/ca8cc213-8ef4-4e95-b490-3b614b44e95c-kube-api-access-j6c6t\") pod \"certified-operators-bgrmd\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:29 crc kubenswrapper[4835]: I1002 11:55:29.683308 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:30 crc kubenswrapper[4835]: I1002 11:55:30.278805 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgrmd"] Oct 02 11:55:31 crc kubenswrapper[4835]: I1002 11:55:31.263593 4835 generic.go:334] "Generic (PLEG): container finished" podID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerID="1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2" exitCode=0 Oct 02 11:55:31 crc kubenswrapper[4835]: I1002 11:55:31.263661 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgrmd" event={"ID":"ca8cc213-8ef4-4e95-b490-3b614b44e95c","Type":"ContainerDied","Data":"1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2"} Oct 02 11:55:31 crc kubenswrapper[4835]: I1002 11:55:31.264112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgrmd" event={"ID":"ca8cc213-8ef4-4e95-b490-3b614b44e95c","Type":"ContainerStarted","Data":"575c3567251e560c8cc9645cd2d368ed46854505405e63a47a55909d86318c92"} Oct 02 11:55:31 crc kubenswrapper[4835]: I1002 11:55:31.266314 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:55:33 crc kubenswrapper[4835]: I1002 11:55:33.284849 4835 generic.go:334] "Generic (PLEG): container finished" podID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerID="92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228" exitCode=0 Oct 02 11:55:33 crc kubenswrapper[4835]: I1002 11:55:33.284908 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgrmd" event={"ID":"ca8cc213-8ef4-4e95-b490-3b614b44e95c","Type":"ContainerDied","Data":"92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228"} Oct 02 11:55:34 crc kubenswrapper[4835]: I1002 11:55:34.294849 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgrmd" 
event={"ID":"ca8cc213-8ef4-4e95-b490-3b614b44e95c","Type":"ContainerStarted","Data":"4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024"} Oct 02 11:55:34 crc kubenswrapper[4835]: I1002 11:55:34.324020 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bgrmd" podStartSLOduration=2.702748262 podStartE2EDuration="5.323996707s" podCreationTimestamp="2025-10-02 11:55:29 +0000 UTC" firstStartedPulling="2025-10-02 11:55:31.266058624 +0000 UTC m=+3607.825966205" lastFinishedPulling="2025-10-02 11:55:33.887307069 +0000 UTC m=+3610.447214650" observedRunningTime="2025-10-02 11:55:34.313450556 +0000 UTC m=+3610.873358147" watchObservedRunningTime="2025-10-02 11:55:34.323996707 +0000 UTC m=+3610.883904288" Oct 02 11:55:39 crc kubenswrapper[4835]: I1002 11:55:39.683523 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:39 crc kubenswrapper[4835]: I1002 11:55:39.684112 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:39 crc kubenswrapper[4835]: I1002 11:55:39.740585 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:40 crc kubenswrapper[4835]: I1002 11:55:40.409524 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:40 crc kubenswrapper[4835]: I1002 11:55:40.463019 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgrmd"] Oct 02 11:55:41 crc kubenswrapper[4835]: I1002 11:55:41.984640 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:55:41 crc kubenswrapper[4835]: I1002 11:55:41.985984 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:55:42 crc kubenswrapper[4835]: I1002 11:55:42.374555 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bgrmd" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="registry-server" containerID="cri-o://4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024" gracePeriod=2 Oct 02 11:55:42 crc kubenswrapper[4835]: I1002 11:55:42.993786 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.176653 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-utilities\") pod \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.176821 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6c6t\" (UniqueName: \"kubernetes.io/projected/ca8cc213-8ef4-4e95-b490-3b614b44e95c-kube-api-access-j6c6t\") pod \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.177089 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-catalog-content\") pod \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\" (UID: \"ca8cc213-8ef4-4e95-b490-3b614b44e95c\") " Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.177425 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-utilities" (OuterVolumeSpecName: "utilities") pod "ca8cc213-8ef4-4e95-b490-3b614b44e95c" (UID: "ca8cc213-8ef4-4e95-b490-3b614b44e95c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.177822 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.184278 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8cc213-8ef4-4e95-b490-3b614b44e95c-kube-api-access-j6c6t" (OuterVolumeSpecName: "kube-api-access-j6c6t") pod "ca8cc213-8ef4-4e95-b490-3b614b44e95c" (UID: "ca8cc213-8ef4-4e95-b490-3b614b44e95c"). InnerVolumeSpecName "kube-api-access-j6c6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.226441 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca8cc213-8ef4-4e95-b490-3b614b44e95c" (UID: "ca8cc213-8ef4-4e95-b490-3b614b44e95c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.279948 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6c6t\" (UniqueName: \"kubernetes.io/projected/ca8cc213-8ef4-4e95-b490-3b614b44e95c-kube-api-access-j6c6t\") on node \"crc\" DevicePath \"\"" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.280001 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca8cc213-8ef4-4e95-b490-3b614b44e95c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.393880 4835 generic.go:334] "Generic (PLEG): container finished" podID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerID="4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024" exitCode=0 Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.393950 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgrmd" event={"ID":"ca8cc213-8ef4-4e95-b490-3b614b44e95c","Type":"ContainerDied","Data":"4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024"} Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.393981 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgrmd" event={"ID":"ca8cc213-8ef4-4e95-b490-3b614b44e95c","Type":"ContainerDied","Data":"575c3567251e560c8cc9645cd2d368ed46854505405e63a47a55909d86318c92"} Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.394006 4835 scope.go:117] "RemoveContainer" containerID="4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.394052 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgrmd" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.442127 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgrmd"] Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.450328 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bgrmd"] Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.451819 4835 scope.go:117] "RemoveContainer" containerID="92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.497983 4835 scope.go:117] "RemoveContainer" containerID="1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.523562 4835 scope.go:117] "RemoveContainer" containerID="4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024" Oct 02 11:55:43 crc kubenswrapper[4835]: E1002 11:55:43.524111 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024\": container with ID starting with 4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024 not found: ID does not exist" containerID="4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.524153 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024"} err="failed to get container status \"4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024\": rpc error: code = NotFound desc = could not find container \"4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024\": container with ID starting with 4a6d640376fce02bf06702e9828090d65a0c843fd01c0beebe3ba58058a7b024 not found: ID does not exist" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.524181 4835 scope.go:117] "RemoveContainer" containerID="92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228" Oct 02 11:55:43 crc kubenswrapper[4835]: E1002 11:55:43.524784 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228\": container with ID starting with 92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228 not found: ID does not exist" containerID="92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.524820 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228"} err="failed to get container status \"92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228\": rpc error: code = NotFound desc = could not find container \"92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228\": container with ID starting with 92c23f36345d4be40c7d64578bcf7eb3a554d005965057c4d324f32c99991228 not found: ID does not exist" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.524842 4835 scope.go:117] "RemoveContainer" containerID="1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2" Oct 02 11:55:43 crc kubenswrapper[4835]: E1002 11:55:43.525389 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2\": container with ID starting with 1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2 not found: ID does not exist" containerID="1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2" Oct 02 11:55:43 crc kubenswrapper[4835]: I1002 11:55:43.525424 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2"} err="failed to get container status \"1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2\": rpc error: code = NotFound desc = could not find container \"1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2\": container with ID starting with 1ab53656edfbd9e346bae0c854b5237f0f69b085a03cbcbb98d139f8b28cc2f2 not found: ID does not exist" Oct 02 11:55:44 crc kubenswrapper[4835]: I1002 11:55:44.263786 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" path="/var/lib/kubelet/pods/ca8cc213-8ef4-4e95-b490-3b614b44e95c/volumes" Oct 02 11:56:11 crc kubenswrapper[4835]: I1002 11:56:11.984360 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:56:11 crc kubenswrapper[4835]: I1002 11:56:11.985070 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:56:11 crc kubenswrapper[4835]: I1002 11:56:11.985126 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:56:11 crc kubenswrapper[4835]: I1002 11:56:11.985978 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"226253d9d759d74973616d813e5b5469b61bc70d02466d6f60a4aa36a28fe914"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:56:11 crc kubenswrapper[4835]: I1002 11:56:11.986039 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://226253d9d759d74973616d813e5b5469b61bc70d02466d6f60a4aa36a28fe914" gracePeriod=600 Oct 02 11:56:12 crc kubenswrapper[4835]: I1002 11:56:12.667833 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="226253d9d759d74973616d813e5b5469b61bc70d02466d6f60a4aa36a28fe914" exitCode=0 Oct 02 11:56:12 crc kubenswrapper[4835]: I1002 11:56:12.667908 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" 
event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"226253d9d759d74973616d813e5b5469b61bc70d02466d6f60a4aa36a28fe914"} Oct 02 11:56:12 crc kubenswrapper[4835]: I1002 11:56:12.668174 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15"} Oct 02 11:56:12 crc kubenswrapper[4835]: I1002 11:56:12.668194 4835 scope.go:117] "RemoveContainer" containerID="3ec7726a95047c357f120c97a94ad4c4fbe1b256ca257739c72b160ce543c4fe" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.291972 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hnsw6"] Oct 02 11:56:32 crc kubenswrapper[4835]: E1002 11:56:32.292942 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="extract-content" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.292960 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="extract-content" Oct 02 11:56:32 crc kubenswrapper[4835]: E1002 11:56:32.292974 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="registry-server" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.292984 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="registry-server" Oct 02 11:56:32 crc kubenswrapper[4835]: E1002 11:56:32.293024 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="extract-utilities" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.293033 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="extract-utilities" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.293275 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca8cc213-8ef4-4e95-b490-3b614b44e95c" containerName="registry-server" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.294913 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.333485 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnsw6"] Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.419913 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-catalog-content\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.420000 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-utilities\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.420050 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7nwq\" (UniqueName: \"kubernetes.io/projected/b185ece9-142f-46b4-b6d7-875acaa9e7a6-kube-api-access-q7nwq\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.522493 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-catalog-content\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.522591 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7nwq\" (UniqueName: \"kubernetes.io/projected/b185ece9-142f-46b4-b6d7-875acaa9e7a6-kube-api-access-q7nwq\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.522616 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-utilities\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.523201 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-utilities\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.523429 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-catalog-content\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.543820 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q7nwq\" (UniqueName: \"kubernetes.io/projected/b185ece9-142f-46b4-b6d7-875acaa9e7a6-kube-api-access-q7nwq\") pod \"redhat-marketplace-hnsw6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:32 crc kubenswrapper[4835]: I1002 11:56:32.635287 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:33 crc kubenswrapper[4835]: I1002 11:56:33.106637 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnsw6"] Oct 02 11:56:33 crc kubenswrapper[4835]: I1002 11:56:33.873323 4835 generic.go:334] "Generic (PLEG): container finished" podID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerID="d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25" exitCode=0 Oct 02 11:56:33 crc kubenswrapper[4835]: I1002 11:56:33.873473 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnsw6" event={"ID":"b185ece9-142f-46b4-b6d7-875acaa9e7a6","Type":"ContainerDied","Data":"d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25"} Oct 02 11:56:33 crc kubenswrapper[4835]: I1002 11:56:33.873895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnsw6" event={"ID":"b185ece9-142f-46b4-b6d7-875acaa9e7a6","Type":"ContainerStarted","Data":"bb2763d1212688694c5f41274825522cdfab79b1c3a1691255465f838884bda0"} Oct 02 11:56:34 crc kubenswrapper[4835]: I1002 11:56:34.885855 4835 generic.go:334] "Generic (PLEG): container finished" podID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerID="0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202" exitCode=0 Oct 02 11:56:34 crc kubenswrapper[4835]: I1002 11:56:34.885952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnsw6" event={"ID":"b185ece9-142f-46b4-b6d7-875acaa9e7a6","Type":"ContainerDied","Data":"0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202"} Oct 02 11:56:35 crc kubenswrapper[4835]: I1002 11:56:35.898926 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnsw6" event={"ID":"b185ece9-142f-46b4-b6d7-875acaa9e7a6","Type":"ContainerStarted","Data":"7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e"} Oct 02 11:56:35 crc kubenswrapper[4835]: I1002 11:56:35.922724 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hnsw6" podStartSLOduration=2.442039193 podStartE2EDuration="3.922703265s" podCreationTimestamp="2025-10-02 11:56:32 +0000 UTC" firstStartedPulling="2025-10-02 11:56:33.876632172 +0000 UTC m=+3670.436539793" lastFinishedPulling="2025-10-02 11:56:35.357296284 +0000 UTC m=+3671.917203865" observedRunningTime="2025-10-02 11:56:35.921373267 +0000 UTC m=+3672.481280848" watchObservedRunningTime="2025-10-02 11:56:35.922703265 +0000 UTC m=+3672.482610846" Oct 02 11:56:42 crc kubenswrapper[4835]: I1002 11:56:42.635977 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:42 crc kubenswrapper[4835]: I1002 11:56:42.636666 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:42 crc kubenswrapper[4835]: I1002 11:56:42.713914 4835 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:43 crc kubenswrapper[4835]: I1002 11:56:43.016090 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:43 crc kubenswrapper[4835]: I1002 11:56:43.061595 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnsw6"] Oct 02 11:56:44 crc kubenswrapper[4835]: I1002 11:56:44.985158 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hnsw6" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="registry-server" containerID="cri-o://7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e" gracePeriod=2 Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.628023 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.690975 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-catalog-content\") pod \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.691145 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7nwq\" (UniqueName: \"kubernetes.io/projected/b185ece9-142f-46b4-b6d7-875acaa9e7a6-kube-api-access-q7nwq\") pod \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.691202 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-utilities\") pod \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\" (UID: \"b185ece9-142f-46b4-b6d7-875acaa9e7a6\") " Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.692631 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-utilities" (OuterVolumeSpecName: "utilities") pod "b185ece9-142f-46b4-b6d7-875acaa9e7a6" (UID: "b185ece9-142f-46b4-b6d7-875acaa9e7a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.704579 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b185ece9-142f-46b4-b6d7-875acaa9e7a6-kube-api-access-q7nwq" (OuterVolumeSpecName: "kube-api-access-q7nwq") pod "b185ece9-142f-46b4-b6d7-875acaa9e7a6" (UID: "b185ece9-142f-46b4-b6d7-875acaa9e7a6"). InnerVolumeSpecName "kube-api-access-q7nwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.707527 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b185ece9-142f-46b4-b6d7-875acaa9e7a6" (UID: "b185ece9-142f-46b4-b6d7-875acaa9e7a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.793509 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.793541 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7nwq\" (UniqueName: \"kubernetes.io/projected/b185ece9-142f-46b4-b6d7-875acaa9e7a6-kube-api-access-q7nwq\") on node \"crc\" DevicePath \"\"" Oct 02 11:56:45 crc kubenswrapper[4835]: I1002 11:56:45.793552 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b185ece9-142f-46b4-b6d7-875acaa9e7a6-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.000486 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnsw6" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.002341 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnsw6" event={"ID":"b185ece9-142f-46b4-b6d7-875acaa9e7a6","Type":"ContainerDied","Data":"7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e"} Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.002458 4835 scope.go:117] "RemoveContainer" containerID="7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.000289 4835 generic.go:334] "Generic (PLEG): container finished" podID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerID="7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e" exitCode=0 Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.002925 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnsw6" event={"ID":"b185ece9-142f-46b4-b6d7-875acaa9e7a6","Type":"ContainerDied","Data":"bb2763d1212688694c5f41274825522cdfab79b1c3a1691255465f838884bda0"} Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.037725 4835 scope.go:117] "RemoveContainer" containerID="0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.047282 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnsw6"] Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.067295 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnsw6"] Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.079519 4835 scope.go:117] "RemoveContainer" containerID="d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.105879 4835 scope.go:117] "RemoveContainer" containerID="7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e" Oct 02 11:56:46 crc kubenswrapper[4835]: E1002 11:56:46.106323 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e\": container with ID starting with 7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e not found: ID does not exist" containerID="7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.106409 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e"} err="failed to get container status \"7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e\": rpc error: code = NotFound desc = could not find container \"7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e\": container with ID starting with 7536e8c539a3174cef782d0fb767410e697c3e6d420ce5cf219b509f187f2f9e not found: ID does not exist" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.106438 4835 scope.go:117] "RemoveContainer" containerID="0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202" Oct 02 11:56:46 crc kubenswrapper[4835]: E1002 11:56:46.106726 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202\": container with ID starting with 0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202 not found: ID does not exist" containerID="0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.106762 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202"} err="failed to get container status \"0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202\": rpc error: code = NotFound desc = could not find container \"0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202\": container with ID starting with 0defd188ac7da2f23d16bd1c8a27910d5c7a55cb3cbb07d94339cc6ebc108202 not found: ID does not exist" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.106781 4835 scope.go:117] "RemoveContainer" containerID="d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25" Oct 02 11:56:46 crc kubenswrapper[4835]: E1002 11:56:46.107112 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25\": container with ID starting with d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25 not found: ID does not exist" containerID="d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.107141 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25"} err="failed to get container status \"d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25\": rpc error: code = NotFound desc = could not find container \"d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25\": container with ID starting with d837b0a3ed03d32971c30a744aaa05c6e77300084a77f66bc6a6b97bdc384b25 not found: ID does not exist" Oct 02 11:56:46 crc kubenswrapper[4835]: I1002 11:56:46.262082 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" path="/var/lib/kubelet/pods/b185ece9-142f-46b4-b6d7-875acaa9e7a6/volumes" Oct 02 11:57:24 crc kubenswrapper[4835]: I1002 11:57:24.043968 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-s9l7s"] Oct 02 11:57:24 crc kubenswrapper[4835]: I1002 11:57:24.052034 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/manila-db-create-s9l7s"] Oct 02 11:57:24 crc kubenswrapper[4835]: I1002 11:57:24.270246 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb924623-b065-4a9c-b3cb-635e87d989d8" path="/var/lib/kubelet/pods/bb924623-b065-4a9c-b3cb-635e87d989d8/volumes" Oct 02 11:57:36 crc kubenswrapper[4835]: I1002 11:57:36.027134 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-cef2-account-create-2tljw"] Oct 02 11:57:36 crc kubenswrapper[4835]: I1002 11:57:36.036642 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-cef2-account-create-2tljw"] Oct 02 11:57:36 crc kubenswrapper[4835]: I1002 11:57:36.264378 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b115bfa6-dbfa-4ef6-841b-7f48b78fadea" path="/var/lib/kubelet/pods/b115bfa6-dbfa-4ef6-841b-7f48b78fadea/volumes" Oct 02 11:57:59 crc kubenswrapper[4835]: I1002 11:57:59.075811 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-fqh6r"] Oct 02 11:57:59 crc kubenswrapper[4835]: I1002 11:57:59.089393 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-fqh6r"] Oct 02 11:58:00 crc kubenswrapper[4835]: I1002 11:58:00.272960 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d5cda4-bd3f-4c36-93cc-1209c49e43ee" path="/var/lib/kubelet/pods/65d5cda4-bd3f-4c36-93cc-1209c49e43ee/volumes" Oct 02 11:58:00 crc kubenswrapper[4835]: I1002 11:58:00.818792 4835 scope.go:117] "RemoveContainer" containerID="e34305629e2eca88dc690f56867920716f3a340dcd93ca12703c32674aea3656" Oct 02 11:58:00 crc kubenswrapper[4835]: I1002 11:58:00.864266 4835 scope.go:117] "RemoveContainer" containerID="7e882b6662d6f77fe7d881d7c81e9c6ad4985594c1a6dbeb1ede8a2dc82299db" Oct 02 11:58:01 crc kubenswrapper[4835]: I1002 11:58:01.669093 4835 scope.go:117] "RemoveContainer" containerID="a612e14df782188cf432c28f0ba150087c2625ef8d263c73b32ddf23c7fde897" Oct 02 11:58:41 crc kubenswrapper[4835]: I1002 11:58:41.983823 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:58:41 crc kubenswrapper[4835]: I1002 11:58:41.984374 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:59:11 crc kubenswrapper[4835]: I1002 11:59:11.984614 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:59:11 crc kubenswrapper[4835]: I1002 11:59:11.985961 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:59:41 crc kubenswrapper[4835]: I1002 11:59:41.984185 4835 
patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:59:41 crc kubenswrapper[4835]: I1002 11:59:41.984800 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:59:41 crc kubenswrapper[4835]: I1002 11:59:41.984860 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 11:59:41 crc kubenswrapper[4835]: I1002 11:59:41.985778 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:59:41 crc kubenswrapper[4835]: I1002 11:59:41.985947 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" gracePeriod=600 Oct 02 11:59:42 crc kubenswrapper[4835]: E1002 11:59:42.123624 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:59:42 crc kubenswrapper[4835]: I1002 11:59:42.625297 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" exitCode=0 Oct 02 11:59:42 crc kubenswrapper[4835]: I1002 11:59:42.625351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15"} Oct 02 11:59:42 crc kubenswrapper[4835]: I1002 11:59:42.625389 4835 scope.go:117] "RemoveContainer" containerID="226253d9d759d74973616d813e5b5469b61bc70d02466d6f60a4aa36a28fe914" Oct 02 11:59:42 crc kubenswrapper[4835]: I1002 11:59:42.625988 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 11:59:42 crc kubenswrapper[4835]: E1002 11:59:42.626479 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" 
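
The liveness records above show kubelet probing http://127.0.0.1:8798/health on machine-config-daemon-5ckb9 and treating connection refused as a failure; once the failure threshold is crossed it kills the container with the 600-second grace period logged at 11:59:41, and the restart then sits behind CrashLoopBackOff. A minimal stand-alone sketch of an HTTP check of that shape; this is not kubelet's prober, and the one-second timeout, ten-second period, and threshold of three are illustrative assumptions rather than values taken from this pod's probe definition:

    // Stand-alone HTTP liveness-style check against the endpoint from the log.
    // Timeout, period, and failure threshold are assumed values, not read from
    // the probe definition of machine-config-daemon.
    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func probe(url string) error {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unexpected status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        failures := 0
        for failures < 3 { // assumed failureThreshold
            if err := probe("http://127.0.0.1:8798/health"); err != nil {
                failures++
                fmt.Println("probe failed:", err)
            } else {
                failures = 0
                fmt.Println("probe ok")
            }
            time.Sleep(10 * time.Second) // assumed periodSeconds
        }
        fmt.Println("liveness failed: the container would be restarted")
    }
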
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 11:59:57 crc kubenswrapper[4835]: I1002 11:59:57.251538 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 11:59:57 crc kubenswrapper[4835]: E1002 11:59:57.253203 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.152677 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx"] Oct 02 12:00:00 crc kubenswrapper[4835]: E1002 12:00:00.153844 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="extract-content" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.153863 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="extract-content" Oct 02 12:00:00 crc kubenswrapper[4835]: E1002 12:00:00.153891 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.153898 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4835]: E1002 12:00:00.153916 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="extract-utilities" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.153926 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="extract-utilities" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.154175 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b185ece9-142f-46b4-b6d7-875acaa9e7a6" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.154838 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.157594 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.157658 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.163010 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx"] Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.292670 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwfb\" (UniqueName: \"kubernetes.io/projected/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-kube-api-access-jqwfb\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.293033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-config-volume\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.293118 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-secret-volume\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.395634 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-config-volume\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.396109 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-secret-volume\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.396173 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwfb\" (UniqueName: \"kubernetes.io/projected/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-kube-api-access-jqwfb\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.397055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-config-volume\") pod 
\"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.407650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-secret-volume\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.416161 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwfb\" (UniqueName: \"kubernetes.io/projected/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-kube-api-access-jqwfb\") pod \"collect-profiles-29323440-dz4gx\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.510682 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:00 crc kubenswrapper[4835]: I1002 12:00:00.955638 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx"] Oct 02 12:00:01 crc kubenswrapper[4835]: I1002 12:00:01.813914 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" event={"ID":"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b","Type":"ContainerStarted","Data":"113a21e5088a9f04dbbee02cfa180a75cbfc917591e337052129021d22baa932"} Oct 02 12:00:01 crc kubenswrapper[4835]: I1002 12:00:01.815824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" event={"ID":"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b","Type":"ContainerStarted","Data":"86cd4ff244579d3e1b0725a33ec3b5ba5297816fb4e9126d935d6d16b88e12eb"} Oct 02 12:00:01 crc kubenswrapper[4835]: I1002 12:00:01.838254 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" podStartSLOduration=1.838212826 podStartE2EDuration="1.838212826s" podCreationTimestamp="2025-10-02 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:00:01.832502913 +0000 UTC m=+3878.392410494" watchObservedRunningTime="2025-10-02 12:00:01.838212826 +0000 UTC m=+3878.398120407" Oct 02 12:00:02 crc kubenswrapper[4835]: I1002 12:00:02.824649 4835 generic.go:334] "Generic (PLEG): container finished" podID="2b297a85-4f3d-4f4e-91fc-ffa62f5e358b" containerID="113a21e5088a9f04dbbee02cfa180a75cbfc917591e337052129021d22baa932" exitCode=0 Oct 02 12:00:02 crc kubenswrapper[4835]: I1002 12:00:02.824785 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" event={"ID":"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b","Type":"ContainerDied","Data":"113a21e5088a9f04dbbee02cfa180a75cbfc917591e337052129021d22baa932"} Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.298593 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.386498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-config-volume\") pod \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.386706 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-secret-volume\") pod \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.386742 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqwfb\" (UniqueName: \"kubernetes.io/projected/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-kube-api-access-jqwfb\") pod \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\" (UID: \"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b\") " Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.388021 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b297a85-4f3d-4f4e-91fc-ffa62f5e358b" (UID: "2b297a85-4f3d-4f4e-91fc-ffa62f5e358b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.455295 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b297a85-4f3d-4f4e-91fc-ffa62f5e358b" (UID: "2b297a85-4f3d-4f4e-91fc-ffa62f5e358b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.455362 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-kube-api-access-jqwfb" (OuterVolumeSpecName: "kube-api-access-jqwfb") pod "2b297a85-4f3d-4f4e-91fc-ffa62f5e358b" (UID: "2b297a85-4f3d-4f4e-91fc-ffa62f5e358b"). InnerVolumeSpecName "kube-api-access-jqwfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.489212 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.489309 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.489322 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqwfb\" (UniqueName: \"kubernetes.io/projected/2b297a85-4f3d-4f4e-91fc-ffa62f5e358b-kube-api-access-jqwfb\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.842888 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" event={"ID":"2b297a85-4f3d-4f4e-91fc-ffa62f5e358b","Type":"ContainerDied","Data":"86cd4ff244579d3e1b0725a33ec3b5ba5297816fb4e9126d935d6d16b88e12eb"} Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.842943 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86cd4ff244579d3e1b0725a33ec3b5ba5297816fb4e9126d935d6d16b88e12eb" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.843005 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-dz4gx" Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.902344 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24"] Oct 02 12:00:04 crc kubenswrapper[4835]: I1002 12:00:04.909010 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-cdw24"] Oct 02 12:00:06 crc kubenswrapper[4835]: I1002 12:00:06.263636 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2905629f-e865-4e05-a222-a84e1fa0b88a" path="/var/lib/kubelet/pods/2905629f-e865-4e05-a222-a84e1fa0b88a/volumes" Oct 02 12:00:08 crc kubenswrapper[4835]: I1002 12:00:08.252765 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:00:08 crc kubenswrapper[4835]: E1002 12:00:08.253513 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:00:21 crc kubenswrapper[4835]: I1002 12:00:21.252012 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:00:21 crc kubenswrapper[4835]: E1002 12:00:21.252896 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:00:36 crc kubenswrapper[4835]: I1002 12:00:36.254002 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:00:36 crc kubenswrapper[4835]: E1002 12:00:36.254822 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:00:48 crc kubenswrapper[4835]: I1002 12:00:48.252488 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:00:48 crc kubenswrapper[4835]: E1002 12:00:48.253410 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.148216 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323441-b6crb"] Oct 02 12:01:00 crc kubenswrapper[4835]: E1002 12:01:00.149118 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b297a85-4f3d-4f4e-91fc-ffa62f5e358b" containerName="collect-profiles" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.149132 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b297a85-4f3d-4f4e-91fc-ffa62f5e358b" containerName="collect-profiles" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.149336 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b297a85-4f3d-4f4e-91fc-ffa62f5e358b" containerName="collect-profiles" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.150014 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.159079 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323441-b6crb"] Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.245807 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-combined-ca-bundle\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.245895 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9bjc\" (UniqueName: \"kubernetes.io/projected/cfbb9e42-19fe-4ab0-9b02-38adc586df01-kube-api-access-n9bjc\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.245921 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-config-data\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.245943 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-fernet-keys\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.347442 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-combined-ca-bundle\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.347607 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9bjc\" (UniqueName: \"kubernetes.io/projected/cfbb9e42-19fe-4ab0-9b02-38adc586df01-kube-api-access-n9bjc\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.347635 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-config-data\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.347651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-fernet-keys\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.354857 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-fernet-keys\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.375026 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-combined-ca-bundle\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.375899 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-config-data\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.378845 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9bjc\" (UniqueName: \"kubernetes.io/projected/cfbb9e42-19fe-4ab0-9b02-38adc586df01-kube-api-access-n9bjc\") pod \"keystone-cron-29323441-b6crb\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.479743 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:00 crc kubenswrapper[4835]: I1002 12:01:00.922200 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323441-b6crb"] Oct 02 12:01:01 crc kubenswrapper[4835]: I1002 12:01:01.327627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-b6crb" event={"ID":"cfbb9e42-19fe-4ab0-9b02-38adc586df01","Type":"ContainerStarted","Data":"397c25c63c7782b223d4c4563dd440229a1119ef5a99dd708e4a868e7767982d"} Oct 02 12:01:01 crc kubenswrapper[4835]: I1002 12:01:01.327990 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-b6crb" event={"ID":"cfbb9e42-19fe-4ab0-9b02-38adc586df01","Type":"ContainerStarted","Data":"eeb54ec4c19b535e879947963258489709a65acec208f0e47755a4d108bb34b6"} Oct 02 12:01:01 crc kubenswrapper[4835]: I1002 12:01:01.349137 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323441-b6crb" podStartSLOduration=1.349112029 podStartE2EDuration="1.349112029s" podCreationTimestamp="2025-10-02 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:01:01.342081298 +0000 UTC m=+3937.901988889" watchObservedRunningTime="2025-10-02 12:01:01.349112029 +0000 UTC m=+3937.909019610" Oct 02 12:01:01 crc kubenswrapper[4835]: I1002 12:01:01.813496 4835 scope.go:117] "RemoveContainer" containerID="e9904b51882f3055edd07d5c0bde074ccce42910ec6e682b9f1ed535ee701864" Oct 02 12:01:03 crc kubenswrapper[4835]: I1002 12:01:03.252069 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:01:03 crc kubenswrapper[4835]: E1002 12:01:03.252732 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:01:04 crc kubenswrapper[4835]: I1002 12:01:04.356714 4835 generic.go:334] "Generic (PLEG): container finished" podID="cfbb9e42-19fe-4ab0-9b02-38adc586df01" containerID="397c25c63c7782b223d4c4563dd440229a1119ef5a99dd708e4a868e7767982d" exitCode=0 Oct 02 12:01:04 crc kubenswrapper[4835]: I1002 12:01:04.356796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-b6crb" event={"ID":"cfbb9e42-19fe-4ab0-9b02-38adc586df01","Type":"ContainerDied","Data":"397c25c63c7782b223d4c4563dd440229a1119ef5a99dd708e4a868e7767982d"} Oct 02 12:01:05 crc kubenswrapper[4835]: I1002 12:01:05.883468 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.061166 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-config-data\") pod \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.061303 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-fernet-keys\") pod \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.061399 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-combined-ca-bundle\") pod \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.061469 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9bjc\" (UniqueName: \"kubernetes.io/projected/cfbb9e42-19fe-4ab0-9b02-38adc586df01-kube-api-access-n9bjc\") pod \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\" (UID: \"cfbb9e42-19fe-4ab0-9b02-38adc586df01\") " Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.068800 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfbb9e42-19fe-4ab0-9b02-38adc586df01-kube-api-access-n9bjc" (OuterVolumeSpecName: "kube-api-access-n9bjc") pod "cfbb9e42-19fe-4ab0-9b02-38adc586df01" (UID: "cfbb9e42-19fe-4ab0-9b02-38adc586df01"). InnerVolumeSpecName "kube-api-access-n9bjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.085326 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cfbb9e42-19fe-4ab0-9b02-38adc586df01" (UID: "cfbb9e42-19fe-4ab0-9b02-38adc586df01"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.125899 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfbb9e42-19fe-4ab0-9b02-38adc586df01" (UID: "cfbb9e42-19fe-4ab0-9b02-38adc586df01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.139495 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-config-data" (OuterVolumeSpecName: "config-data") pod "cfbb9e42-19fe-4ab0-9b02-38adc586df01" (UID: "cfbb9e42-19fe-4ab0-9b02-38adc586df01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.169660 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.169694 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.169703 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbb9e42-19fe-4ab0-9b02-38adc586df01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.169714 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9bjc\" (UniqueName: \"kubernetes.io/projected/cfbb9e42-19fe-4ab0-9b02-38adc586df01-kube-api-access-n9bjc\") on node \"crc\" DevicePath \"\"" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.374002 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323441-b6crb" event={"ID":"cfbb9e42-19fe-4ab0-9b02-38adc586df01","Type":"ContainerDied","Data":"eeb54ec4c19b535e879947963258489709a65acec208f0e47755a4d108bb34b6"} Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.374035 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323441-b6crb" Oct 02 12:01:06 crc kubenswrapper[4835]: I1002 12:01:06.374045 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeb54ec4c19b535e879947963258489709a65acec208f0e47755a4d108bb34b6" Oct 02 12:01:18 crc kubenswrapper[4835]: I1002 12:01:18.251412 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:01:18 crc kubenswrapper[4835]: E1002 12:01:18.252013 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:01:30 crc kubenswrapper[4835]: I1002 12:01:30.252127 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:01:30 crc kubenswrapper[4835]: E1002 12:01:30.252946 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:01:43 crc kubenswrapper[4835]: I1002 12:01:43.252627 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:01:43 crc kubenswrapper[4835]: E1002 12:01:43.253387 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:01:56 crc kubenswrapper[4835]: I1002 12:01:56.251805 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:01:56 crc kubenswrapper[4835]: E1002 12:01:56.253904 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:02:07 crc kubenswrapper[4835]: I1002 12:02:07.252625 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:02:07 crc kubenswrapper[4835]: E1002 12:02:07.253520 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:02:20 crc kubenswrapper[4835]: I1002 12:02:20.252363 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:02:20 crc kubenswrapper[4835]: E1002 12:02:20.253106 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.869314 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w2k8n"] Oct 02 12:02:22 crc kubenswrapper[4835]: E1002 12:02:22.873722 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfbb9e42-19fe-4ab0-9b02-38adc586df01" containerName="keystone-cron" Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.873841 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfbb9e42-19fe-4ab0-9b02-38adc586df01" containerName="keystone-cron" Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.874150 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfbb9e42-19fe-4ab0-9b02-38adc586df01" containerName="keystone-cron" Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.875886 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.883122 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2k8n"] Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.995282 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-catalog-content\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.995331 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wptzk\" (UniqueName: \"kubernetes.io/projected/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-kube-api-access-wptzk\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:22 crc kubenswrapper[4835]: I1002 12:02:22.995372 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-utilities\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc kubenswrapper[4835]: I1002 12:02:23.097852 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-catalog-content\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc 
kubenswrapper[4835]: I1002 12:02:23.097901 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wptzk\" (UniqueName: \"kubernetes.io/projected/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-kube-api-access-wptzk\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc kubenswrapper[4835]: I1002 12:02:23.097941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-utilities\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc kubenswrapper[4835]: I1002 12:02:23.098531 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-catalog-content\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc kubenswrapper[4835]: I1002 12:02:23.098593 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-utilities\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc kubenswrapper[4835]: I1002 12:02:23.116574 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wptzk\" (UniqueName: \"kubernetes.io/projected/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-kube-api-access-wptzk\") pod \"redhat-operators-w2k8n\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc kubenswrapper[4835]: I1002 12:02:23.203230 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:23 crc kubenswrapper[4835]: I1002 12:02:23.719170 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2k8n"] Oct 02 12:02:24 crc kubenswrapper[4835]: I1002 12:02:24.028894 4835 generic.go:334] "Generic (PLEG): container finished" podID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerID="2dc53e0b48ce3071610d73b9017a719308afc3c984f0a3bdb5750c7070e57760" exitCode=0 Oct 02 12:02:24 crc kubenswrapper[4835]: I1002 12:02:24.028995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2k8n" event={"ID":"3b7cfbb0-2e3a-4614-978e-1f5142b583f1","Type":"ContainerDied","Data":"2dc53e0b48ce3071610d73b9017a719308afc3c984f0a3bdb5750c7070e57760"} Oct 02 12:02:24 crc kubenswrapper[4835]: I1002 12:02:24.029286 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2k8n" event={"ID":"3b7cfbb0-2e3a-4614-978e-1f5142b583f1","Type":"ContainerStarted","Data":"44c25d05a9a1fc1e831ecbfefaad113d598b731d634dfe28bdd180360521bf0f"} Oct 02 12:02:24 crc kubenswrapper[4835]: I1002 12:02:24.030964 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:02:26 crc kubenswrapper[4835]: I1002 12:02:26.046679 4835 generic.go:334] "Generic (PLEG): container finished" podID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerID="99dcf263e64797e8573f2d13e72da012db5bcf7e41001fb643627303ac1854eb" exitCode=0 Oct 02 12:02:26 crc kubenswrapper[4835]: I1002 12:02:26.046781 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2k8n" event={"ID":"3b7cfbb0-2e3a-4614-978e-1f5142b583f1","Type":"ContainerDied","Data":"99dcf263e64797e8573f2d13e72da012db5bcf7e41001fb643627303ac1854eb"} Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.051191 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m8vfv"] Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.054083 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.067949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2k8n" event={"ID":"3b7cfbb0-2e3a-4614-978e-1f5142b583f1","Type":"ContainerStarted","Data":"b959a223e962c743a4f6efc30ce94a5c44e0e214657e19c4b75a3193cd6a2523"} Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.069654 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8vfv"] Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.110298 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w2k8n" podStartSLOduration=2.468586632 podStartE2EDuration="5.11026028s" podCreationTimestamp="2025-10-02 12:02:22 +0000 UTC" firstStartedPulling="2025-10-02 12:02:24.030575407 +0000 UTC m=+4020.590482988" lastFinishedPulling="2025-10-02 12:02:26.672249055 +0000 UTC m=+4023.232156636" observedRunningTime="2025-10-02 12:02:27.105299988 +0000 UTC m=+4023.665207569" watchObservedRunningTime="2025-10-02 12:02:27.11026028 +0000 UTC m=+4023.670167861" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.176344 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-catalog-content\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.176397 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-utilities\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.176553 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjw74\" (UniqueName: \"kubernetes.io/projected/66c96dda-7759-47d8-a1d6-e6ebc49351cb-kube-api-access-zjw74\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.278012 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-catalog-content\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.278072 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-utilities\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.278156 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjw74\" (UniqueName: \"kubernetes.io/projected/66c96dda-7759-47d8-a1d6-e6ebc49351cb-kube-api-access-zjw74\") pod \"community-operators-m8vfv\" (UID: 
\"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.278874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-catalog-content\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.279090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-utilities\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.313139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjw74\" (UniqueName: \"kubernetes.io/projected/66c96dda-7759-47d8-a1d6-e6ebc49351cb-kube-api-access-zjw74\") pod \"community-operators-m8vfv\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.381815 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:27 crc kubenswrapper[4835]: I1002 12:02:27.935864 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8vfv"] Oct 02 12:02:28 crc kubenswrapper[4835]: I1002 12:02:28.078085 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8vfv" event={"ID":"66c96dda-7759-47d8-a1d6-e6ebc49351cb","Type":"ContainerStarted","Data":"063000d00a1447625618aa3d6904e72b68926e730fdd301c4a767a845babd6a2"} Oct 02 12:02:29 crc kubenswrapper[4835]: I1002 12:02:29.088440 4835 generic.go:334] "Generic (PLEG): container finished" podID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerID="fc97ec6f8e260cafadd2e1a5a80a963471da367db14a8b4804ce57755391ed2b" exitCode=0 Oct 02 12:02:29 crc kubenswrapper[4835]: I1002 12:02:29.088504 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8vfv" event={"ID":"66c96dda-7759-47d8-a1d6-e6ebc49351cb","Type":"ContainerDied","Data":"fc97ec6f8e260cafadd2e1a5a80a963471da367db14a8b4804ce57755391ed2b"} Oct 02 12:02:30 crc kubenswrapper[4835]: I1002 12:02:30.099467 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8vfv" event={"ID":"66c96dda-7759-47d8-a1d6-e6ebc49351cb","Type":"ContainerStarted","Data":"e6a73cd5c0ec56495edf59df92b6fbe31759c7109b153d2fb42ee038addb7793"} Oct 02 12:02:31 crc kubenswrapper[4835]: I1002 12:02:31.109308 4835 generic.go:334] "Generic (PLEG): container finished" podID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerID="e6a73cd5c0ec56495edf59df92b6fbe31759c7109b153d2fb42ee038addb7793" exitCode=0 Oct 02 12:02:31 crc kubenswrapper[4835]: I1002 12:02:31.109403 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8vfv" event={"ID":"66c96dda-7759-47d8-a1d6-e6ebc49351cb","Type":"ContainerDied","Data":"e6a73cd5c0ec56495edf59df92b6fbe31759c7109b153d2fb42ee038addb7793"} Oct 02 12:02:32 crc kubenswrapper[4835]: I1002 12:02:32.134513 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-m8vfv" event={"ID":"66c96dda-7759-47d8-a1d6-e6ebc49351cb","Type":"ContainerStarted","Data":"f0ff75e949b18e3062e7ed4859c916755d0185990cd0f8569283e6dfa288ba19"} Oct 02 12:02:32 crc kubenswrapper[4835]: I1002 12:02:32.196039 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m8vfv" podStartSLOduration=2.585032214 podStartE2EDuration="5.196017645s" podCreationTimestamp="2025-10-02 12:02:27 +0000 UTC" firstStartedPulling="2025-10-02 12:02:29.090566287 +0000 UTC m=+4025.650473878" lastFinishedPulling="2025-10-02 12:02:31.701551728 +0000 UTC m=+4028.261459309" observedRunningTime="2025-10-02 12:02:32.190403815 +0000 UTC m=+4028.750311426" watchObservedRunningTime="2025-10-02 12:02:32.196017645 +0000 UTC m=+4028.755925226" Oct 02 12:02:33 crc kubenswrapper[4835]: I1002 12:02:33.204300 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:33 crc kubenswrapper[4835]: I1002 12:02:33.204608 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:33 crc kubenswrapper[4835]: I1002 12:02:33.258100 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:34 crc kubenswrapper[4835]: I1002 12:02:34.200757 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:34 crc kubenswrapper[4835]: I1002 12:02:34.252048 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:02:34 crc kubenswrapper[4835]: E1002 12:02:34.253351 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:02:34 crc kubenswrapper[4835]: I1002 12:02:34.643827 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2k8n"] Oct 02 12:02:36 crc kubenswrapper[4835]: I1002 12:02:36.166410 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w2k8n" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="registry-server" containerID="cri-o://b959a223e962c743a4f6efc30ce94a5c44e0e214657e19c4b75a3193cd6a2523" gracePeriod=2 Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.181556 4835 generic.go:334] "Generic (PLEG): container finished" podID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerID="b959a223e962c743a4f6efc30ce94a5c44e0e214657e19c4b75a3193cd6a2523" exitCode=0 Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.181603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2k8n" event={"ID":"3b7cfbb0-2e3a-4614-978e-1f5142b583f1","Type":"ContainerDied","Data":"b959a223e962c743a4f6efc30ce94a5c44e0e214657e19c4b75a3193cd6a2523"} Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.382945 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.382997 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.433952 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.475647 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.610307 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-catalog-content\") pod \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.610386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wptzk\" (UniqueName: \"kubernetes.io/projected/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-kube-api-access-wptzk\") pod \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.610509 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-utilities\") pod \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\" (UID: \"3b7cfbb0-2e3a-4614-978e-1f5142b583f1\") " Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.611412 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-utilities" (OuterVolumeSpecName: "utilities") pod "3b7cfbb0-2e3a-4614-978e-1f5142b583f1" (UID: "3b7cfbb0-2e3a-4614-978e-1f5142b583f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.611633 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.615765 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-kube-api-access-wptzk" (OuterVolumeSpecName: "kube-api-access-wptzk") pod "3b7cfbb0-2e3a-4614-978e-1f5142b583f1" (UID: "3b7cfbb0-2e3a-4614-978e-1f5142b583f1"). InnerVolumeSpecName "kube-api-access-wptzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.692467 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b7cfbb0-2e3a-4614-978e-1f5142b583f1" (UID: "3b7cfbb0-2e3a-4614-978e-1f5142b583f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.713658 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:37 crc kubenswrapper[4835]: I1002 12:02:37.713706 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wptzk\" (UniqueName: \"kubernetes.io/projected/3b7cfbb0-2e3a-4614-978e-1f5142b583f1-kube-api-access-wptzk\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.193842 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2k8n" event={"ID":"3b7cfbb0-2e3a-4614-978e-1f5142b583f1","Type":"ContainerDied","Data":"44c25d05a9a1fc1e831ecbfefaad113d598b731d634dfe28bdd180360521bf0f"} Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.194199 4835 scope.go:117] "RemoveContainer" containerID="b959a223e962c743a4f6efc30ce94a5c44e0e214657e19c4b75a3193cd6a2523" Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.193906 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2k8n" Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.227745 4835 scope.go:117] "RemoveContainer" containerID="99dcf263e64797e8573f2d13e72da012db5bcf7e41001fb643627303ac1854eb" Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.228103 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2k8n"] Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.236763 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w2k8n"] Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.266293 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" path="/var/lib/kubelet/pods/3b7cfbb0-2e3a-4614-978e-1f5142b583f1/volumes" Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.275243 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.275444 4835 scope.go:117] "RemoveContainer" containerID="2dc53e0b48ce3071610d73b9017a719308afc3c984f0a3bdb5750c7070e57760" Oct 02 12:02:38 crc kubenswrapper[4835]: I1002 12:02:38.841773 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8vfv"] Oct 02 12:02:40 crc kubenswrapper[4835]: I1002 12:02:40.213635 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m8vfv" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="registry-server" containerID="cri-o://f0ff75e949b18e3062e7ed4859c916755d0185990cd0f8569283e6dfa288ba19" gracePeriod=2 Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.229214 4835 generic.go:334] "Generic (PLEG): container finished" podID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerID="f0ff75e949b18e3062e7ed4859c916755d0185990cd0f8569283e6dfa288ba19" exitCode=0 Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.229410 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8vfv" 
event={"ID":"66c96dda-7759-47d8-a1d6-e6ebc49351cb","Type":"ContainerDied","Data":"f0ff75e949b18e3062e7ed4859c916755d0185990cd0f8569283e6dfa288ba19"} Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.468935 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.594717 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-catalog-content\") pod \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.595259 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjw74\" (UniqueName: \"kubernetes.io/projected/66c96dda-7759-47d8-a1d6-e6ebc49351cb-kube-api-access-zjw74\") pod \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.595317 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-utilities\") pod \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\" (UID: \"66c96dda-7759-47d8-a1d6-e6ebc49351cb\") " Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.596147 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-utilities" (OuterVolumeSpecName: "utilities") pod "66c96dda-7759-47d8-a1d6-e6ebc49351cb" (UID: "66c96dda-7759-47d8-a1d6-e6ebc49351cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.601177 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c96dda-7759-47d8-a1d6-e6ebc49351cb-kube-api-access-zjw74" (OuterVolumeSpecName: "kube-api-access-zjw74") pod "66c96dda-7759-47d8-a1d6-e6ebc49351cb" (UID: "66c96dda-7759-47d8-a1d6-e6ebc49351cb"). InnerVolumeSpecName "kube-api-access-zjw74". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.647727 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66c96dda-7759-47d8-a1d6-e6ebc49351cb" (UID: "66c96dda-7759-47d8-a1d6-e6ebc49351cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.698241 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.698281 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c96dda-7759-47d8-a1d6-e6ebc49351cb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:41 crc kubenswrapper[4835]: I1002 12:02:41.698295 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjw74\" (UniqueName: \"kubernetes.io/projected/66c96dda-7759-47d8-a1d6-e6ebc49351cb-kube-api-access-zjw74\") on node \"crc\" DevicePath \"\"" Oct 02 12:02:42 crc kubenswrapper[4835]: I1002 12:02:42.241965 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8vfv" event={"ID":"66c96dda-7759-47d8-a1d6-e6ebc49351cb","Type":"ContainerDied","Data":"063000d00a1447625618aa3d6904e72b68926e730fdd301c4a767a845babd6a2"} Oct 02 12:02:42 crc kubenswrapper[4835]: I1002 12:02:42.242039 4835 scope.go:117] "RemoveContainer" containerID="f0ff75e949b18e3062e7ed4859c916755d0185990cd0f8569283e6dfa288ba19" Oct 02 12:02:42 crc kubenswrapper[4835]: I1002 12:02:42.242100 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8vfv" Oct 02 12:02:42 crc kubenswrapper[4835]: I1002 12:02:42.280103 4835 scope.go:117] "RemoveContainer" containerID="e6a73cd5c0ec56495edf59df92b6fbe31759c7109b153d2fb42ee038addb7793" Oct 02 12:02:42 crc kubenswrapper[4835]: I1002 12:02:42.283712 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8vfv"] Oct 02 12:02:42 crc kubenswrapper[4835]: I1002 12:02:42.303395 4835 scope.go:117] "RemoveContainer" containerID="fc97ec6f8e260cafadd2e1a5a80a963471da367db14a8b4804ce57755391ed2b" Oct 02 12:02:42 crc kubenswrapper[4835]: I1002 12:02:42.306698 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m8vfv"] Oct 02 12:02:44 crc kubenswrapper[4835]: I1002 12:02:44.262339 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" path="/var/lib/kubelet/pods/66c96dda-7759-47d8-a1d6-e6ebc49351cb/volumes" Oct 02 12:02:49 crc kubenswrapper[4835]: I1002 12:02:49.251947 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:02:49 crc kubenswrapper[4835]: E1002 12:02:49.252717 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:03:03 crc kubenswrapper[4835]: I1002 12:03:03.252525 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:03:03 crc kubenswrapper[4835]: E1002 12:03:03.253307 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:03:18 crc kubenswrapper[4835]: I1002 12:03:18.252053 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:03:18 crc kubenswrapper[4835]: E1002 12:03:18.252933 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:03:32 crc kubenswrapper[4835]: I1002 12:03:32.251807 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:03:32 crc kubenswrapper[4835]: E1002 12:03:32.252659 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:03:46 crc kubenswrapper[4835]: I1002 12:03:46.252044 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:03:46 crc kubenswrapper[4835]: E1002 12:03:46.252898 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:04:01 crc kubenswrapper[4835]: I1002 12:04:01.252412 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:04:01 crc kubenswrapper[4835]: E1002 12:04:01.253109 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:04:13 crc kubenswrapper[4835]: I1002 12:04:13.252440 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:04:13 crc kubenswrapper[4835]: E1002 12:04:13.253150 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:04:24 crc kubenswrapper[4835]: I1002 12:04:24.259369 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:04:24 crc kubenswrapper[4835]: E1002 12:04:24.260459 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:04:37 crc kubenswrapper[4835]: I1002 12:04:37.251679 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:04:37 crc kubenswrapper[4835]: E1002 12:04:37.252522 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:04:52 crc kubenswrapper[4835]: I1002 12:04:52.251611 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:04:53 crc kubenswrapper[4835]: I1002 12:04:53.438975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"27ba56610f9ebb17349cd78b224974601c6901888b5a5f52e034e8244a9b0d30"} Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.104575 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vs24"] Oct 02 12:06:10 crc kubenswrapper[4835]: E1002 12:06:10.105731 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="registry-server" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.105747 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="registry-server" Oct 02 12:06:10 crc kubenswrapper[4835]: E1002 12:06:10.105773 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="extract-content" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.105782 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="extract-content" Oct 02 12:06:10 crc kubenswrapper[4835]: E1002 12:06:10.105797 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="extract-utilities" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.105806 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="extract-utilities" Oct 02 12:06:10 crc kubenswrapper[4835]: E1002 12:06:10.105831 4835 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="registry-server" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.105840 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="registry-server" Oct 02 12:06:10 crc kubenswrapper[4835]: E1002 12:06:10.105857 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="extract-utilities" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.105865 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="extract-utilities" Oct 02 12:06:10 crc kubenswrapper[4835]: E1002 12:06:10.105882 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="extract-content" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.105890 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="extract-content" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.106150 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c96dda-7759-47d8-a1d6-e6ebc49351cb" containerName="registry-server" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.106194 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7cfbb0-2e3a-4614-978e-1f5142b583f1" containerName="registry-server" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.110350 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.141720 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vs24"] Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.302414 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-utilities\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.302856 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-catalog-content\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.302895 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qlk6\" (UniqueName: \"kubernetes.io/projected/e755d217-f5c9-4632-bf5e-0f53c628d414-kube-api-access-4qlk6\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.405263 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-utilities\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 
12:06:10.405367 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-catalog-content\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.405406 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qlk6\" (UniqueName: \"kubernetes.io/projected/e755d217-f5c9-4632-bf5e-0f53c628d414-kube-api-access-4qlk6\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.406197 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-catalog-content\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.406350 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-utilities\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.442734 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qlk6\" (UniqueName: \"kubernetes.io/projected/e755d217-f5c9-4632-bf5e-0f53c628d414-kube-api-access-4qlk6\") pod \"certified-operators-9vs24\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:10 crc kubenswrapper[4835]: I1002 12:06:10.738539 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:11 crc kubenswrapper[4835]: I1002 12:06:11.284009 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vs24"] Oct 02 12:06:12 crc kubenswrapper[4835]: I1002 12:06:12.130499 4835 generic.go:334] "Generic (PLEG): container finished" podID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerID="6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91" exitCode=0 Oct 02 12:06:12 crc kubenswrapper[4835]: I1002 12:06:12.130634 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vs24" event={"ID":"e755d217-f5c9-4632-bf5e-0f53c628d414","Type":"ContainerDied","Data":"6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91"} Oct 02 12:06:12 crc kubenswrapper[4835]: I1002 12:06:12.131128 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vs24" event={"ID":"e755d217-f5c9-4632-bf5e-0f53c628d414","Type":"ContainerStarted","Data":"1e506fad828e94917785c7707d4e3ef061003e1ae1ad54696286f7b735e501be"} Oct 02 12:06:14 crc kubenswrapper[4835]: I1002 12:06:14.151232 4835 generic.go:334] "Generic (PLEG): container finished" podID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerID="a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f" exitCode=0 Oct 02 12:06:14 crc kubenswrapper[4835]: I1002 12:06:14.151323 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vs24" event={"ID":"e755d217-f5c9-4632-bf5e-0f53c628d414","Type":"ContainerDied","Data":"a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f"} Oct 02 12:06:15 crc kubenswrapper[4835]: I1002 12:06:15.188935 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vs24" event={"ID":"e755d217-f5c9-4632-bf5e-0f53c628d414","Type":"ContainerStarted","Data":"ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27"} Oct 02 12:06:15 crc kubenswrapper[4835]: I1002 12:06:15.215925 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vs24" podStartSLOduration=2.668504676 podStartE2EDuration="5.215901204s" podCreationTimestamp="2025-10-02 12:06:10 +0000 UTC" firstStartedPulling="2025-10-02 12:06:12.13364566 +0000 UTC m=+4248.693553241" lastFinishedPulling="2025-10-02 12:06:14.681042188 +0000 UTC m=+4251.240949769" observedRunningTime="2025-10-02 12:06:15.208670928 +0000 UTC m=+4251.768578519" watchObservedRunningTime="2025-10-02 12:06:15.215901204 +0000 UTC m=+4251.775808785" Oct 02 12:06:20 crc kubenswrapper[4835]: I1002 12:06:20.738785 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:20 crc kubenswrapper[4835]: I1002 12:06:20.739349 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:20 crc kubenswrapper[4835]: I1002 12:06:20.792686 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:21 crc kubenswrapper[4835]: I1002 12:06:21.296693 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:21 crc kubenswrapper[4835]: I1002 12:06:21.351929 4835 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-9vs24"] Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.258175 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9vs24" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="registry-server" containerID="cri-o://ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27" gracePeriod=2 Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.809986 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.820986 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qlk6\" (UniqueName: \"kubernetes.io/projected/e755d217-f5c9-4632-bf5e-0f53c628d414-kube-api-access-4qlk6\") pod \"e755d217-f5c9-4632-bf5e-0f53c628d414\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.821059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-catalog-content\") pod \"e755d217-f5c9-4632-bf5e-0f53c628d414\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.821278 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-utilities\") pod \"e755d217-f5c9-4632-bf5e-0f53c628d414\" (UID: \"e755d217-f5c9-4632-bf5e-0f53c628d414\") " Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.822196 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-utilities" (OuterVolumeSpecName: "utilities") pod "e755d217-f5c9-4632-bf5e-0f53c628d414" (UID: "e755d217-f5c9-4632-bf5e-0f53c628d414"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.824495 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.840476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e755d217-f5c9-4632-bf5e-0f53c628d414-kube-api-access-4qlk6" (OuterVolumeSpecName: "kube-api-access-4qlk6") pod "e755d217-f5c9-4632-bf5e-0f53c628d414" (UID: "e755d217-f5c9-4632-bf5e-0f53c628d414"). InnerVolumeSpecName "kube-api-access-4qlk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:06:23 crc kubenswrapper[4835]: I1002 12:06:23.926036 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qlk6\" (UniqueName: \"kubernetes.io/projected/e755d217-f5c9-4632-bf5e-0f53c628d414-kube-api-access-4qlk6\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.028553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e755d217-f5c9-4632-bf5e-0f53c628d414" (UID: "e755d217-f5c9-4632-bf5e-0f53c628d414"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.130432 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e755d217-f5c9-4632-bf5e-0f53c628d414-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.268277 4835 generic.go:334] "Generic (PLEG): container finished" podID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerID="ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27" exitCode=0 Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.268315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vs24" event={"ID":"e755d217-f5c9-4632-bf5e-0f53c628d414","Type":"ContainerDied","Data":"ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27"} Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.268347 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vs24" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.268367 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vs24" event={"ID":"e755d217-f5c9-4632-bf5e-0f53c628d414","Type":"ContainerDied","Data":"1e506fad828e94917785c7707d4e3ef061003e1ae1ad54696286f7b735e501be"} Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.268388 4835 scope.go:117] "RemoveContainer" containerID="ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.316389 4835 scope.go:117] "RemoveContainer" containerID="a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.316634 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vs24"] Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.341033 4835 scope.go:117] "RemoveContainer" containerID="6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.352989 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9vs24"] Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.381716 4835 scope.go:117] "RemoveContainer" containerID="ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27" Oct 02 12:06:24 crc kubenswrapper[4835]: E1002 12:06:24.382214 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27\": container with ID starting with ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27 not found: ID does not exist" containerID="ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.382276 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27"} err="failed to get container status \"ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27\": rpc error: code = NotFound desc = could not find container \"ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27\": container with ID starting with ef9b4a508f7c0996a9da4f30fb726e90f168456fcf255579b65122c58b7d5c27 not found: ID does not exist" Oct 02 
12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.382304 4835 scope.go:117] "RemoveContainer" containerID="a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f" Oct 02 12:06:24 crc kubenswrapper[4835]: E1002 12:06:24.384578 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f\": container with ID starting with a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f not found: ID does not exist" containerID="a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.384609 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f"} err="failed to get container status \"a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f\": rpc error: code = NotFound desc = could not find container \"a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f\": container with ID starting with a72ffd877653910cd224ffb903c2ff0d26cc484d63c6ecbb242ffb850ed1720f not found: ID does not exist" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.384627 4835 scope.go:117] "RemoveContainer" containerID="6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91" Oct 02 12:06:24 crc kubenswrapper[4835]: E1002 12:06:24.385031 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91\": container with ID starting with 6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91 not found: ID does not exist" containerID="6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91" Oct 02 12:06:24 crc kubenswrapper[4835]: I1002 12:06:24.385061 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91"} err="failed to get container status \"6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91\": rpc error: code = NotFound desc = could not find container \"6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91\": container with ID starting with 6d6c1afee4b32eacb7230d00224c82da1b6a1af39223b63dce2014527ed46a91 not found: ID does not exist" Oct 02 12:06:25 crc kubenswrapper[4835]: E1002 12:06:25.340588 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode755d217_f5c9_4632_bf5e_0f53c628d414.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:06:26 crc kubenswrapper[4835]: I1002 12:06:26.262376 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" path="/var/lib/kubelet/pods/e755d217-f5c9-4632-bf5e-0f53c628d414/volumes" Oct 02 12:06:35 crc kubenswrapper[4835]: E1002 12:06:35.599488 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode755d217_f5c9_4632_bf5e_0f53c628d414.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:06:45 crc kubenswrapper[4835]: E1002 12:06:45.880073 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode755d217_f5c9_4632_bf5e_0f53c628d414.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:06:56 crc kubenswrapper[4835]: E1002 12:06:56.141539 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode755d217_f5c9_4632_bf5e_0f53c628d414.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:07:06 crc kubenswrapper[4835]: E1002 12:07:06.381115 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode755d217_f5c9_4632_bf5e_0f53c628d414.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:07:11 crc kubenswrapper[4835]: I1002 12:07:11.984031 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:07:11 crc kubenswrapper[4835]: I1002 12:07:11.984565 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:07:16 crc kubenswrapper[4835]: E1002 12:07:16.619608 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode755d217_f5c9_4632_bf5e_0f53c628d414.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:07:41 crc kubenswrapper[4835]: I1002 12:07:41.984731 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:07:41 crc kubenswrapper[4835]: I1002 12:07:41.985345 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:08:11 crc kubenswrapper[4835]: I1002 12:08:11.983745 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:08:11 crc kubenswrapper[4835]: I1002 12:08:11.984365 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:08:11 crc kubenswrapper[4835]: I1002 12:08:11.984415 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 12:08:11 crc kubenswrapper[4835]: I1002 12:08:11.985194 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27ba56610f9ebb17349cd78b224974601c6901888b5a5f52e034e8244a9b0d30"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:08:11 crc kubenswrapper[4835]: I1002 12:08:11.985266 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://27ba56610f9ebb17349cd78b224974601c6901888b5a5f52e034e8244a9b0d30" gracePeriod=600 Oct 02 12:08:12 crc kubenswrapper[4835]: I1002 12:08:12.252316 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="27ba56610f9ebb17349cd78b224974601c6901888b5a5f52e034e8244a9b0d30" exitCode=0 Oct 02 12:08:12 crc kubenswrapper[4835]: I1002 12:08:12.261575 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"27ba56610f9ebb17349cd78b224974601c6901888b5a5f52e034e8244a9b0d30"} Oct 02 12:08:12 crc kubenswrapper[4835]: I1002 12:08:12.261668 4835 scope.go:117] "RemoveContainer" containerID="6c8d3c2a6fef8232c5e3419fb831f57107f4eb1b6268071881909ff0d1f0af15" Oct 02 12:08:13 crc kubenswrapper[4835]: I1002 12:08:13.264420 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24"} Oct 02 12:10:41 crc kubenswrapper[4835]: I1002 12:10:41.984112 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:10:41 crc kubenswrapper[4835]: I1002 12:10:41.984761 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:11:11 crc kubenswrapper[4835]: I1002 12:11:11.983687 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:11:11 crc kubenswrapper[4835]: I1002 12:11:11.984170 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:11:41 crc 
kubenswrapper[4835]: I1002 12:11:41.984699 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:11:41 crc kubenswrapper[4835]: I1002 12:11:41.985414 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:11:41 crc kubenswrapper[4835]: I1002 12:11:41.985472 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 12:11:41 crc kubenswrapper[4835]: I1002 12:11:41.986396 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:11:41 crc kubenswrapper[4835]: I1002 12:11:41.986473 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" gracePeriod=600 Oct 02 12:11:42 crc kubenswrapper[4835]: E1002 12:11:42.116339 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:11:43 crc kubenswrapper[4835]: I1002 12:11:43.116160 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" exitCode=0 Oct 02 12:11:43 crc kubenswrapper[4835]: I1002 12:11:43.116560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24"} Oct 02 12:11:43 crc kubenswrapper[4835]: I1002 12:11:43.116603 4835 scope.go:117] "RemoveContainer" containerID="27ba56610f9ebb17349cd78b224974601c6901888b5a5f52e034e8244a9b0d30" Oct 02 12:11:43 crc kubenswrapper[4835]: I1002 12:11:43.117443 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:11:43 crc kubenswrapper[4835]: E1002 12:11:43.117880 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:11:55 crc kubenswrapper[4835]: I1002 12:11:55.251610 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:11:55 crc kubenswrapper[4835]: E1002 12:11:55.253453 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:12:09 crc kubenswrapper[4835]: I1002 12:12:09.252418 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:12:09 crc kubenswrapper[4835]: E1002 12:12:09.252972 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:12:20 crc kubenswrapper[4835]: I1002 12:12:20.251771 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:12:20 crc kubenswrapper[4835]: E1002 12:12:20.255187 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:12:31 crc kubenswrapper[4835]: I1002 12:12:31.251624 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:12:31 crc kubenswrapper[4835]: E1002 12:12:31.252432 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:12:42 crc kubenswrapper[4835]: I1002 12:12:42.252001 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:12:42 crc kubenswrapper[4835]: E1002 12:12:42.252843 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" 
podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:12:56 crc kubenswrapper[4835]: I1002 12:12:56.252135 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:12:56 crc kubenswrapper[4835]: E1002 12:12:56.253085 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:13:07 crc kubenswrapper[4835]: I1002 12:13:07.253104 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:13:07 crc kubenswrapper[4835]: E1002 12:13:07.254541 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:13:13 crc kubenswrapper[4835]: I1002 12:13:13.892329 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9sxdq"] Oct 02 12:13:13 crc kubenswrapper[4835]: E1002 12:13:13.893459 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="extract-utilities" Oct 02 12:13:13 crc kubenswrapper[4835]: I1002 12:13:13.893479 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="extract-utilities" Oct 02 12:13:13 crc kubenswrapper[4835]: E1002 12:13:13.893513 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="extract-content" Oct 02 12:13:13 crc kubenswrapper[4835]: I1002 12:13:13.893522 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="extract-content" Oct 02 12:13:13 crc kubenswrapper[4835]: E1002 12:13:13.893548 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="registry-server" Oct 02 12:13:13 crc kubenswrapper[4835]: I1002 12:13:13.893557 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="registry-server" Oct 02 12:13:13 crc kubenswrapper[4835]: I1002 12:13:13.893852 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e755d217-f5c9-4632-bf5e-0f53c628d414" containerName="registry-server" Oct 02 12:13:13 crc kubenswrapper[4835]: I1002 12:13:13.895786 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:13 crc kubenswrapper[4835]: I1002 12:13:13.902448 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9sxdq"] Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.037171 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-utilities\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.037446 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr6pp\" (UniqueName: \"kubernetes.io/projected/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-kube-api-access-tr6pp\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.037543 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-catalog-content\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.139540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-catalog-content\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.140024 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-utilities\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.140143 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr6pp\" (UniqueName: \"kubernetes.io/projected/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-kube-api-access-tr6pp\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.140491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-catalog-content\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.140529 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-utilities\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.171148 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tr6pp\" (UniqueName: \"kubernetes.io/projected/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-kube-api-access-tr6pp\") pod \"community-operators-9sxdq\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.261723 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:14 crc kubenswrapper[4835]: W1002 12:13:14.820964 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef6f2a3a_33e4_4b4e_89de_8d2a19f420f4.slice/crio-dc745ca846b77d06a17d6f7dc144c44f2fc242e8662b8b3b5c9e9aa5d0a5db0f WatchSource:0}: Error finding container dc745ca846b77d06a17d6f7dc144c44f2fc242e8662b8b3b5c9e9aa5d0a5db0f: Status 404 returned error can't find the container with id dc745ca846b77d06a17d6f7dc144c44f2fc242e8662b8b3b5c9e9aa5d0a5db0f Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.822529 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9sxdq"] Oct 02 12:13:14 crc kubenswrapper[4835]: I1002 12:13:14.906651 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sxdq" event={"ID":"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4","Type":"ContainerStarted","Data":"dc745ca846b77d06a17d6f7dc144c44f2fc242e8662b8b3b5c9e9aa5d0a5db0f"} Oct 02 12:13:15 crc kubenswrapper[4835]: I1002 12:13:15.918060 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerID="036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e" exitCode=0 Oct 02 12:13:15 crc kubenswrapper[4835]: I1002 12:13:15.918126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sxdq" event={"ID":"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4","Type":"ContainerDied","Data":"036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e"} Oct 02 12:13:15 crc kubenswrapper[4835]: I1002 12:13:15.920606 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:13:16 crc kubenswrapper[4835]: I1002 12:13:16.928128 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sxdq" event={"ID":"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4","Type":"ContainerStarted","Data":"1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952"} Oct 02 12:13:17 crc kubenswrapper[4835]: I1002 12:13:17.941160 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerID="1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952" exitCode=0 Oct 02 12:13:17 crc kubenswrapper[4835]: I1002 12:13:17.941381 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sxdq" event={"ID":"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4","Type":"ContainerDied","Data":"1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952"} Oct 02 12:13:19 crc kubenswrapper[4835]: I1002 12:13:19.961752 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sxdq" event={"ID":"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4","Type":"ContainerStarted","Data":"6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805"} Oct 02 12:13:19 crc kubenswrapper[4835]: I1002 
12:13:19.983864 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9sxdq" podStartSLOduration=3.822285883 podStartE2EDuration="6.983842462s" podCreationTimestamp="2025-10-02 12:13:13 +0000 UTC" firstStartedPulling="2025-10-02 12:13:15.920296643 +0000 UTC m=+4672.480204244" lastFinishedPulling="2025-10-02 12:13:19.081853242 +0000 UTC m=+4675.641760823" observedRunningTime="2025-10-02 12:13:19.979889679 +0000 UTC m=+4676.539797280" watchObservedRunningTime="2025-10-02 12:13:19.983842462 +0000 UTC m=+4676.543750043" Oct 02 12:13:21 crc kubenswrapper[4835]: I1002 12:13:21.253281 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:13:21 crc kubenswrapper[4835]: E1002 12:13:21.254076 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:13:24 crc kubenswrapper[4835]: I1002 12:13:24.263784 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:24 crc kubenswrapper[4835]: I1002 12:13:24.264260 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:24 crc kubenswrapper[4835]: I1002 12:13:24.803142 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:25 crc kubenswrapper[4835]: I1002 12:13:25.052999 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:25 crc kubenswrapper[4835]: I1002 12:13:25.105887 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9sxdq"] Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.020826 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9sxdq" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="registry-server" containerID="cri-o://6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805" gracePeriod=2 Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.547387 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.628376 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr6pp\" (UniqueName: \"kubernetes.io/projected/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-kube-api-access-tr6pp\") pod \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.628627 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-catalog-content\") pod \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.628657 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-utilities\") pod \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\" (UID: \"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4\") " Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.629514 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-utilities" (OuterVolumeSpecName: "utilities") pod "ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" (UID: "ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.637140 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-kube-api-access-tr6pp" (OuterVolumeSpecName: "kube-api-access-tr6pp") pod "ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" (UID: "ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4"). InnerVolumeSpecName "kube-api-access-tr6pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.687007 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" (UID: "ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.732188 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr6pp\" (UniqueName: \"kubernetes.io/projected/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-kube-api-access-tr6pp\") on node \"crc\" DevicePath \"\"" Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.732281 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:13:27 crc kubenswrapper[4835]: I1002 12:13:27.732297 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.030051 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerID="6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805" exitCode=0 Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.030093 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sxdq" event={"ID":"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4","Type":"ContainerDied","Data":"6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805"} Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.030119 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sxdq" event={"ID":"ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4","Type":"ContainerDied","Data":"dc745ca846b77d06a17d6f7dc144c44f2fc242e8662b8b3b5c9e9aa5d0a5db0f"} Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.030135 4835 scope.go:117] "RemoveContainer" containerID="6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.030336 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9sxdq" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.078416 4835 scope.go:117] "RemoveContainer" containerID="1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.080270 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9sxdq"] Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.088969 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9sxdq"] Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.106193 4835 scope.go:117] "RemoveContainer" containerID="036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.149674 4835 scope.go:117] "RemoveContainer" containerID="6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805" Oct 02 12:13:28 crc kubenswrapper[4835]: E1002 12:13:28.150203 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805\": container with ID starting with 6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805 not found: ID does not exist" containerID="6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.150319 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805"} err="failed to get container status \"6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805\": rpc error: code = NotFound desc = could not find container \"6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805\": container with ID starting with 6659c54cc6ba41d7579aefd547038357bf065a059f277e01b80deeb2224eb805 not found: ID does not exist" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.150342 4835 scope.go:117] "RemoveContainer" containerID="1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952" Oct 02 12:13:28 crc kubenswrapper[4835]: E1002 12:13:28.150650 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952\": container with ID starting with 1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952 not found: ID does not exist" containerID="1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.150670 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952"} err="failed to get container status \"1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952\": rpc error: code = NotFound desc = could not find container \"1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952\": container with ID starting with 1be89ccfb2614bada54404629687e1d04a4e9bfec1e4f70ec12eb6fbefab5952 not found: ID does not exist" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.150683 4835 scope.go:117] "RemoveContainer" containerID="036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e" Oct 02 12:13:28 crc kubenswrapper[4835]: E1002 12:13:28.150965 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e\": container with ID starting with 036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e not found: ID does not exist" containerID="036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.151030 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e"} err="failed to get container status \"036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e\": rpc error: code = NotFound desc = could not find container \"036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e\": container with ID starting with 036210925cd29d8de58bfb251dad3463dd515e24646183cbbfe5044a38d3bb2e not found: ID does not exist" Oct 02 12:13:28 crc kubenswrapper[4835]: I1002 12:13:28.265048 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" path="/var/lib/kubelet/pods/ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4/volumes" Oct 02 12:13:36 crc kubenswrapper[4835]: I1002 12:13:36.253044 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:13:36 crc kubenswrapper[4835]: E1002 12:13:36.254559 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:13:48 crc kubenswrapper[4835]: I1002 12:13:48.253380 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:13:48 crc kubenswrapper[4835]: E1002 12:13:48.254309 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:14:03 crc kubenswrapper[4835]: I1002 12:14:03.251860 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:14:03 crc kubenswrapper[4835]: E1002 12:14:03.252786 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:14:15 crc kubenswrapper[4835]: I1002 12:14:15.252302 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:14:15 crc kubenswrapper[4835]: E1002 12:14:15.253140 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:14:26 crc kubenswrapper[4835]: I1002 12:14:26.251377 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:14:26 crc kubenswrapper[4835]: E1002 12:14:26.252284 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:14:38 crc kubenswrapper[4835]: I1002 12:14:38.252673 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:14:38 crc kubenswrapper[4835]: E1002 12:14:38.253381 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:14:50 crc kubenswrapper[4835]: I1002 12:14:50.253319 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:14:50 crc kubenswrapper[4835]: E1002 12:14:50.254112 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.152188 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x"] Oct 02 12:15:00 crc kubenswrapper[4835]: E1002 12:15:00.153091 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.153105 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4835]: E1002 12:15:00.153123 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="extract-content" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.153129 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="extract-content" Oct 02 12:15:00 crc kubenswrapper[4835]: E1002 12:15:00.153145 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="registry-server" Oct 02 
12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.153152 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.153431 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6f2a3a-33e4-4b4e-89de-8d2a19f420f4" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.154134 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.156808 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.157461 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.179642 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x"] Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.308485 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcjst\" (UniqueName: \"kubernetes.io/projected/a29ee970-a145-48fd-acda-4d6c9036f842-kube-api-access-bcjst\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.308736 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29ee970-a145-48fd-acda-4d6c9036f842-secret-volume\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.308811 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29ee970-a145-48fd-acda-4d6c9036f842-config-volume\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.410748 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcjst\" (UniqueName: \"kubernetes.io/projected/a29ee970-a145-48fd-acda-4d6c9036f842-kube-api-access-bcjst\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.411071 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29ee970-a145-48fd-acda-4d6c9036f842-secret-volume\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.411197 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/a29ee970-a145-48fd-acda-4d6c9036f842-config-volume\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.412264 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29ee970-a145-48fd-acda-4d6c9036f842-config-volume\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.423324 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29ee970-a145-48fd-acda-4d6c9036f842-secret-volume\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.427645 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcjst\" (UniqueName: \"kubernetes.io/projected/a29ee970-a145-48fd-acda-4d6c9036f842-kube-api-access-bcjst\") pod \"collect-profiles-29323455-hcx5x\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.482717 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:00 crc kubenswrapper[4835]: I1002 12:15:00.962115 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x"] Oct 02 12:15:01 crc kubenswrapper[4835]: I1002 12:15:01.851166 4835 generic.go:334] "Generic (PLEG): container finished" podID="a29ee970-a145-48fd-acda-4d6c9036f842" containerID="c4bf3f95913e8c8b7413b3b7a35bdd9476a954a0e61a8f637b62a395ecb9d853" exitCode=0 Oct 02 12:15:01 crc kubenswrapper[4835]: I1002 12:15:01.851270 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" event={"ID":"a29ee970-a145-48fd-acda-4d6c9036f842","Type":"ContainerDied","Data":"c4bf3f95913e8c8b7413b3b7a35bdd9476a954a0e61a8f637b62a395ecb9d853"} Oct 02 12:15:01 crc kubenswrapper[4835]: I1002 12:15:01.852648 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" event={"ID":"a29ee970-a145-48fd-acda-4d6c9036f842","Type":"ContainerStarted","Data":"3270e30a9e88bdb50a8e2404207bdf4db1726a7e81c318da2b7728c46531d94f"} Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.241114 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.371936 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcjst\" (UniqueName: \"kubernetes.io/projected/a29ee970-a145-48fd-acda-4d6c9036f842-kube-api-access-bcjst\") pod \"a29ee970-a145-48fd-acda-4d6c9036f842\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.372198 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29ee970-a145-48fd-acda-4d6c9036f842-secret-volume\") pod \"a29ee970-a145-48fd-acda-4d6c9036f842\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.372292 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29ee970-a145-48fd-acda-4d6c9036f842-config-volume\") pod \"a29ee970-a145-48fd-acda-4d6c9036f842\" (UID: \"a29ee970-a145-48fd-acda-4d6c9036f842\") " Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.373283 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a29ee970-a145-48fd-acda-4d6c9036f842-config-volume" (OuterVolumeSpecName: "config-volume") pod "a29ee970-a145-48fd-acda-4d6c9036f842" (UID: "a29ee970-a145-48fd-acda-4d6c9036f842"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.380871 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29ee970-a145-48fd-acda-4d6c9036f842-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a29ee970-a145-48fd-acda-4d6c9036f842" (UID: "a29ee970-a145-48fd-acda-4d6c9036f842"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.385195 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29ee970-a145-48fd-acda-4d6c9036f842-kube-api-access-bcjst" (OuterVolumeSpecName: "kube-api-access-bcjst") pod "a29ee970-a145-48fd-acda-4d6c9036f842" (UID: "a29ee970-a145-48fd-acda-4d6c9036f842"). InnerVolumeSpecName "kube-api-access-bcjst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.474658 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcjst\" (UniqueName: \"kubernetes.io/projected/a29ee970-a145-48fd-acda-4d6c9036f842-kube-api-access-bcjst\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.474704 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29ee970-a145-48fd-acda-4d6c9036f842-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.474715 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29ee970-a145-48fd-acda-4d6c9036f842-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.870342 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" event={"ID":"a29ee970-a145-48fd-acda-4d6c9036f842","Type":"ContainerDied","Data":"3270e30a9e88bdb50a8e2404207bdf4db1726a7e81c318da2b7728c46531d94f"} Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.870381 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3270e30a9e88bdb50a8e2404207bdf4db1726a7e81c318da2b7728c46531d94f" Oct 02 12:15:03 crc kubenswrapper[4835]: I1002 12:15:03.870403 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-hcx5x" Oct 02 12:15:04 crc kubenswrapper[4835]: I1002 12:15:04.321446 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6"] Oct 02 12:15:04 crc kubenswrapper[4835]: I1002 12:15:04.332936 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-ghnb6"] Oct 02 12:15:05 crc kubenswrapper[4835]: I1002 12:15:05.252559 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:15:05 crc kubenswrapper[4835]: E1002 12:15:05.253118 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:15:06 crc kubenswrapper[4835]: I1002 12:15:06.262587 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae9ae1d-0774-4232-9e42-75de03313a30" path="/var/lib/kubelet/pods/1ae9ae1d-0774-4232-9e42-75de03313a30/volumes" Oct 02 12:15:19 crc kubenswrapper[4835]: I1002 12:15:19.252378 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:15:19 crc kubenswrapper[4835]: E1002 12:15:19.253287 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:15:32 crc kubenswrapper[4835]: I1002 12:15:32.251581 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:15:32 crc kubenswrapper[4835]: E1002 12:15:32.252417 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.051211 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fcwl4"] Oct 02 12:15:46 crc kubenswrapper[4835]: E1002 12:15:46.052465 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29ee970-a145-48fd-acda-4d6c9036f842" containerName="collect-profiles" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.052488 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29ee970-a145-48fd-acda-4d6c9036f842" containerName="collect-profiles" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.052731 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29ee970-a145-48fd-acda-4d6c9036f842" containerName="collect-profiles" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.057994 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.068822 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcwl4"] Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.152861 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjq9v\" (UniqueName: \"kubernetes.io/projected/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-kube-api-access-tjq9v\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.153331 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-catalog-content\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.153400 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-utilities\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.255483 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjq9v\" (UniqueName: \"kubernetes.io/projected/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-kube-api-access-tjq9v\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 
02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.255563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-catalog-content\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.255605 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-utilities\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.256027 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-catalog-content\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.256038 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-utilities\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.276446 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjq9v\" (UniqueName: \"kubernetes.io/projected/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-kube-api-access-tjq9v\") pod \"redhat-operators-fcwl4\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.381710 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:46 crc kubenswrapper[4835]: I1002 12:15:46.841649 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcwl4"] Oct 02 12:15:47 crc kubenswrapper[4835]: I1002 12:15:47.251574 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:15:47 crc kubenswrapper[4835]: E1002 12:15:47.252190 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:15:47 crc kubenswrapper[4835]: I1002 12:15:47.325866 4835 generic.go:334] "Generic (PLEG): container finished" podID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerID="b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908" exitCode=0 Oct 02 12:15:47 crc kubenswrapper[4835]: I1002 12:15:47.325919 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcwl4" event={"ID":"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed","Type":"ContainerDied","Data":"b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908"} Oct 02 12:15:47 crc kubenswrapper[4835]: I1002 12:15:47.325950 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcwl4" event={"ID":"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed","Type":"ContainerStarted","Data":"f15e21bd43aa9b60bca3d0e19bcd69ef450796424fe293a850e2466971d19d86"} Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.256626 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-68vls"] Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.259268 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.268624 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68vls"] Oct 02 12:15:49 crc kubenswrapper[4835]: E1002 12:15:49.284863 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a3010f7_9b0a_4464_9b5b_2e83ac8b47ed.slice/crio-conmon-3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a3010f7_9b0a_4464_9b5b_2e83ac8b47ed.slice/crio-3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.314756 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb2mv\" (UniqueName: \"kubernetes.io/projected/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-kube-api-access-qb2mv\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.314908 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-catalog-content\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.314971 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-utilities\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.350553 4835 generic.go:334] "Generic (PLEG): container finished" podID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerID="3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2" exitCode=0 Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.350613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcwl4" event={"ID":"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed","Type":"ContainerDied","Data":"3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2"} Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.415507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-catalog-content\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.415903 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-utilities\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 
12:15:49.415982 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb2mv\" (UniqueName: \"kubernetes.io/projected/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-kube-api-access-qb2mv\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.416115 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-catalog-content\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.416336 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-utilities\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.435124 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb2mv\" (UniqueName: \"kubernetes.io/projected/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-kube-api-access-qb2mv\") pod \"redhat-marketplace-68vls\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:49 crc kubenswrapper[4835]: I1002 12:15:49.598262 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:50 crc kubenswrapper[4835]: I1002 12:15:50.069215 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68vls"] Oct 02 12:15:50 crc kubenswrapper[4835]: I1002 12:15:50.361088 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerStarted","Data":"334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79"} Oct 02 12:15:50 crc kubenswrapper[4835]: I1002 12:15:50.361129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerStarted","Data":"23999e6019055409c31a081c17f1ecd65c514ea3f353857bd5154e921faa515c"} Oct 02 12:15:50 crc kubenswrapper[4835]: I1002 12:15:50.365132 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcwl4" event={"ID":"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed","Type":"ContainerStarted","Data":"63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c"} Oct 02 12:15:50 crc kubenswrapper[4835]: I1002 12:15:50.404384 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fcwl4" podStartSLOduration=1.932180463 podStartE2EDuration="4.40435818s" podCreationTimestamp="2025-10-02 12:15:46 +0000 UTC" firstStartedPulling="2025-10-02 12:15:47.328836519 +0000 UTC m=+4823.888744100" lastFinishedPulling="2025-10-02 12:15:49.801014236 +0000 UTC m=+4826.360921817" observedRunningTime="2025-10-02 12:15:50.396929819 +0000 UTC m=+4826.956837400" watchObservedRunningTime="2025-10-02 12:15:50.40435818 +0000 UTC m=+4826.964265761" Oct 02 12:15:51 crc kubenswrapper[4835]: I1002 12:15:51.374489 4835 
generic.go:334] "Generic (PLEG): container finished" podID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerID="334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79" exitCode=0 Oct 02 12:15:51 crc kubenswrapper[4835]: I1002 12:15:51.374537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerDied","Data":"334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79"} Oct 02 12:15:52 crc kubenswrapper[4835]: I1002 12:15:52.385631 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerStarted","Data":"0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89"} Oct 02 12:15:53 crc kubenswrapper[4835]: I1002 12:15:53.395954 4835 generic.go:334] "Generic (PLEG): container finished" podID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerID="0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89" exitCode=0 Oct 02 12:15:53 crc kubenswrapper[4835]: I1002 12:15:53.396004 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerDied","Data":"0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89"} Oct 02 12:15:54 crc kubenswrapper[4835]: I1002 12:15:54.427138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerStarted","Data":"745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755"} Oct 02 12:15:54 crc kubenswrapper[4835]: I1002 12:15:54.453053 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-68vls" podStartSLOduration=2.872035773 podStartE2EDuration="5.453032206s" podCreationTimestamp="2025-10-02 12:15:49 +0000 UTC" firstStartedPulling="2025-10-02 12:15:51.376335781 +0000 UTC m=+4827.936243362" lastFinishedPulling="2025-10-02 12:15:53.957332214 +0000 UTC m=+4830.517239795" observedRunningTime="2025-10-02 12:15:54.442110405 +0000 UTC m=+4831.002018006" watchObservedRunningTime="2025-10-02 12:15:54.453032206 +0000 UTC m=+4831.012939787" Oct 02 12:15:56 crc kubenswrapper[4835]: I1002 12:15:56.383146 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:56 crc kubenswrapper[4835]: I1002 12:15:56.383616 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:56 crc kubenswrapper[4835]: I1002 12:15:56.445483 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:56 crc kubenswrapper[4835]: I1002 12:15:56.492074 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:57 crc kubenswrapper[4835]: I1002 12:15:57.646204 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcwl4"] Oct 02 12:15:58 crc kubenswrapper[4835]: I1002 12:15:58.459605 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fcwl4" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerName="registry-server" 
containerID="cri-o://63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c" gracePeriod=2 Oct 02 12:15:58 crc kubenswrapper[4835]: I1002 12:15:58.932620 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.106482 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-catalog-content\") pod \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.106557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-utilities\") pod \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.106801 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjq9v\" (UniqueName: \"kubernetes.io/projected/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-kube-api-access-tjq9v\") pod \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\" (UID: \"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed\") " Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.107391 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-utilities" (OuterVolumeSpecName: "utilities") pod "8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" (UID: "8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.112840 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-kube-api-access-tjq9v" (OuterVolumeSpecName: "kube-api-access-tjq9v") pod "8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" (UID: "8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed"). InnerVolumeSpecName "kube-api-access-tjq9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.209268 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjq9v\" (UniqueName: \"kubernetes.io/projected/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-kube-api-access-tjq9v\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.209418 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.252659 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:15:59 crc kubenswrapper[4835]: E1002 12:15:59.253278 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.468187 4835 generic.go:334] "Generic (PLEG): container finished" podID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerID="63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c" exitCode=0 Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.468287 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcwl4" event={"ID":"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed","Type":"ContainerDied","Data":"63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c"} Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.468323 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcwl4" event={"ID":"8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed","Type":"ContainerDied","Data":"f15e21bd43aa9b60bca3d0e19bcd69ef450796424fe293a850e2466971d19d86"} Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.468342 4835 scope.go:117] "RemoveContainer" containerID="63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.468485 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcwl4" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.489955 4835 scope.go:117] "RemoveContainer" containerID="3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.508837 4835 scope.go:117] "RemoveContainer" containerID="b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.557330 4835 scope.go:117] "RemoveContainer" containerID="63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c" Oct 02 12:15:59 crc kubenswrapper[4835]: E1002 12:15:59.557813 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c\": container with ID starting with 63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c not found: ID does not exist" containerID="63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.557849 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c"} err="failed to get container status \"63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c\": rpc error: code = NotFound desc = could not find container \"63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c\": container with ID starting with 63c34377ec3f127d3a796b2495a607a5fa484067e7f5438b58aa3d964e31d25c not found: ID does not exist" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.557868 4835 scope.go:117] "RemoveContainer" containerID="3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2" Oct 02 12:15:59 crc kubenswrapper[4835]: E1002 12:15:59.558383 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2\": container with ID starting with 3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2 not found: ID does not exist" containerID="3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.558434 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2"} err="failed to get container status \"3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2\": rpc error: code = NotFound desc = could not find container \"3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2\": container with ID starting with 3d638ac7ded5e9d6e9dadb824baae82913694da3099d9b61b6e98601bde284c2 not found: ID does not exist" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.558465 4835 scope.go:117] "RemoveContainer" containerID="b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908" Oct 02 12:15:59 crc kubenswrapper[4835]: E1002 12:15:59.558796 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908\": container with ID starting with b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908 not found: ID does not exist" containerID="b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908" 
Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.558820 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908"} err="failed to get container status \"b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908\": rpc error: code = NotFound desc = could not find container \"b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908\": container with ID starting with b3822fa49a0d541032a4a449c3c9038349cb4014916093e96ff2a63eea4ce908 not found: ID does not exist" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.598748 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.602649 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:15:59 crc kubenswrapper[4835]: I1002 12:15:59.645665 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:16:00 crc kubenswrapper[4835]: I1002 12:16:00.538423 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:16:01 crc kubenswrapper[4835]: I1002 12:16:01.281029 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" (UID: "8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:16:01 crc kubenswrapper[4835]: I1002 12:16:01.351600 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:01 crc kubenswrapper[4835]: I1002 12:16:01.610478 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcwl4"] Oct 02 12:16:01 crc kubenswrapper[4835]: I1002 12:16:01.620166 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fcwl4"] Oct 02 12:16:02 crc kubenswrapper[4835]: I1002 12:16:02.039371 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-68vls"] Oct 02 12:16:02 crc kubenswrapper[4835]: I1002 12:16:02.212663 4835 scope.go:117] "RemoveContainer" containerID="c9ef11aedbe4ab190486baa8be722d6c5bfe2abeec4b9933261883dda09d94bb" Oct 02 12:16:02 crc kubenswrapper[4835]: I1002 12:16:02.263287 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" path="/var/lib/kubelet/pods/8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed/volumes" Oct 02 12:16:03 crc kubenswrapper[4835]: I1002 12:16:03.502849 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-68vls" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="registry-server" containerID="cri-o://745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755" gracePeriod=2 Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.450519 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.511911 4835 generic.go:334] "Generic (PLEG): container finished" podID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerID="745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755" exitCode=0 Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.511958 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerDied","Data":"745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755"} Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.511984 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68vls" event={"ID":"3abe6275-30b4-410d-a8ff-b1fe5fef3b37","Type":"ContainerDied","Data":"23999e6019055409c31a081c17f1ecd65c514ea3f353857bd5154e921faa515c"} Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.511999 4835 scope.go:117] "RemoveContainer" containerID="745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.512062 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68vls" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.529106 4835 scope.go:117] "RemoveContainer" containerID="0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.546175 4835 scope.go:117] "RemoveContainer" containerID="334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.589688 4835 scope.go:117] "RemoveContainer" containerID="745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755" Oct 02 12:16:04 crc kubenswrapper[4835]: E1002 12:16:04.590297 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755\": container with ID starting with 745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755 not found: ID does not exist" containerID="745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.590345 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755"} err="failed to get container status \"745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755\": rpc error: code = NotFound desc = could not find container \"745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755\": container with ID starting with 745f113c55c265685cbfbe40ff8f19cc24b5f23c66aa4d5cd62562debf1a2755 not found: ID does not exist" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.590403 4835 scope.go:117] "RemoveContainer" containerID="0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89" Oct 02 12:16:04 crc kubenswrapper[4835]: E1002 12:16:04.590907 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89\": container with ID starting with 0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89 not found: ID does not exist" 
containerID="0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.590987 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89"} err="failed to get container status \"0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89\": rpc error: code = NotFound desc = could not find container \"0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89\": container with ID starting with 0e4243c4499afd7cba2ad0c3713affcf0069ad0c30eaacc082266acd2b367c89 not found: ID does not exist" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.591018 4835 scope.go:117] "RemoveContainer" containerID="334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79" Oct 02 12:16:04 crc kubenswrapper[4835]: E1002 12:16:04.591322 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79\": container with ID starting with 334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79 not found: ID does not exist" containerID="334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.591366 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79"} err="failed to get container status \"334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79\": rpc error: code = NotFound desc = could not find container \"334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79\": container with ID starting with 334ccbdad75b8da30b6e9d155ad680be7857bfd8cd2672018215bbb1dc6c1f79 not found: ID does not exist" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.609668 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-catalog-content\") pod \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.609802 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb2mv\" (UniqueName: \"kubernetes.io/projected/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-kube-api-access-qb2mv\") pod \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.609979 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-utilities\") pod \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\" (UID: \"3abe6275-30b4-410d-a8ff-b1fe5fef3b37\") " Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.610915 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-utilities" (OuterVolumeSpecName: "utilities") pod "3abe6275-30b4-410d-a8ff-b1fe5fef3b37" (UID: "3abe6275-30b4-410d-a8ff-b1fe5fef3b37"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.618982 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-kube-api-access-qb2mv" (OuterVolumeSpecName: "kube-api-access-qb2mv") pod "3abe6275-30b4-410d-a8ff-b1fe5fef3b37" (UID: "3abe6275-30b4-410d-a8ff-b1fe5fef3b37"). InnerVolumeSpecName "kube-api-access-qb2mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.622405 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3abe6275-30b4-410d-a8ff-b1fe5fef3b37" (UID: "3abe6275-30b4-410d-a8ff-b1fe5fef3b37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.712658 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.712703 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb2mv\" (UniqueName: \"kubernetes.io/projected/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-kube-api-access-qb2mv\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.712714 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe6275-30b4-410d-a8ff-b1fe5fef3b37-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.843320 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-68vls"] Oct 02 12:16:04 crc kubenswrapper[4835]: I1002 12:16:04.855766 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-68vls"] Oct 02 12:16:06 crc kubenswrapper[4835]: I1002 12:16:06.263688 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" path="/var/lib/kubelet/pods/3abe6275-30b4-410d-a8ff-b1fe5fef3b37/volumes" Oct 02 12:16:13 crc kubenswrapper[4835]: I1002 12:16:13.252171 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:16:13 crc kubenswrapper[4835]: E1002 12:16:13.252884 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:16:27 crc kubenswrapper[4835]: I1002 12:16:27.251837 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:16:27 crc kubenswrapper[4835]: E1002 12:16:27.252615 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:16:40 crc kubenswrapper[4835]: I1002 12:16:40.251759 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:16:40 crc kubenswrapper[4835]: E1002 12:16:40.252505 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:16:54 crc kubenswrapper[4835]: I1002 12:16:54.258397 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:16:54 crc kubenswrapper[4835]: I1002 12:16:54.944500 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"8db63e229346a83437dd96fd651fc43f648b9d957136005668f3992bb347a1c7"} Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.497280 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l2vc4"] Oct 02 12:16:57 crc kubenswrapper[4835]: E1002 12:16:57.498313 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerName="extract-utilities" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498329 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerName="extract-utilities" Oct 02 12:16:57 crc kubenswrapper[4835]: E1002 12:16:57.498350 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="extract-content" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498358 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="extract-content" Oct 02 12:16:57 crc kubenswrapper[4835]: E1002 12:16:57.498373 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerName="extract-content" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498382 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerName="extract-content" Oct 02 12:16:57 crc kubenswrapper[4835]: E1002 12:16:57.498403 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="registry-server" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498412 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="registry-server" Oct 02 12:16:57 crc kubenswrapper[4835]: E1002 12:16:57.498430 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerName="registry-server" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498437 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" 
containerName="registry-server" Oct 02 12:16:57 crc kubenswrapper[4835]: E1002 12:16:57.498461 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="extract-utilities" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498469 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="extract-utilities" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498699 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3010f7-9b0a-4464-9b5b-2e83ac8b47ed" containerName="registry-server" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.498718 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3abe6275-30b4-410d-a8ff-b1fe5fef3b37" containerName="registry-server" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.500502 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.507612 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2vc4"] Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.662588 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-catalog-content\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.663245 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-utilities\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.663651 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qww89\" (UniqueName: \"kubernetes.io/projected/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-kube-api-access-qww89\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.766141 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-utilities\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.766334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qww89\" (UniqueName: \"kubernetes.io/projected/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-kube-api-access-qww89\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.766451 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-catalog-content\") pod \"certified-operators-l2vc4\" (UID: 
\"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.766794 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-utilities\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.766984 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-catalog-content\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.801212 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qww89\" (UniqueName: \"kubernetes.io/projected/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-kube-api-access-qww89\") pod \"certified-operators-l2vc4\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:57 crc kubenswrapper[4835]: I1002 12:16:57.818939 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:16:58 crc kubenswrapper[4835]: I1002 12:16:58.350828 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2vc4"] Oct 02 12:16:58 crc kubenswrapper[4835]: I1002 12:16:58.982040 4835 generic.go:334] "Generic (PLEG): container finished" podID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerID="e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b" exitCode=0 Oct 02 12:16:58 crc kubenswrapper[4835]: I1002 12:16:58.982106 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2vc4" event={"ID":"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d","Type":"ContainerDied","Data":"e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b"} Oct 02 12:16:58 crc kubenswrapper[4835]: I1002 12:16:58.982391 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2vc4" event={"ID":"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d","Type":"ContainerStarted","Data":"78e5dedb3a50873c42ff6c61ba67f549cc019e6085d89d2e3df8bf99dc52a6bc"} Oct 02 12:17:01 crc kubenswrapper[4835]: I1002 12:17:01.023208 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2vc4" event={"ID":"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d","Type":"ContainerStarted","Data":"79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d"} Oct 02 12:17:02 crc kubenswrapper[4835]: I1002 12:17:02.032452 4835 generic.go:334] "Generic (PLEG): container finished" podID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerID="79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d" exitCode=0 Oct 02 12:17:02 crc kubenswrapper[4835]: I1002 12:17:02.032526 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2vc4" event={"ID":"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d","Type":"ContainerDied","Data":"79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d"} Oct 02 12:17:03 crc kubenswrapper[4835]: I1002 12:17:03.044392 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-l2vc4" event={"ID":"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d","Type":"ContainerStarted","Data":"4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e"} Oct 02 12:17:03 crc kubenswrapper[4835]: I1002 12:17:03.065083 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l2vc4" podStartSLOduration=2.429714708 podStartE2EDuration="6.065064166s" podCreationTimestamp="2025-10-02 12:16:57 +0000 UTC" firstStartedPulling="2025-10-02 12:16:58.983648579 +0000 UTC m=+4895.543556160" lastFinishedPulling="2025-10-02 12:17:02.618998037 +0000 UTC m=+4899.178905618" observedRunningTime="2025-10-02 12:17:03.062006159 +0000 UTC m=+4899.621913740" watchObservedRunningTime="2025-10-02 12:17:03.065064166 +0000 UTC m=+4899.624971747" Oct 02 12:17:07 crc kubenswrapper[4835]: I1002 12:17:07.820523 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:17:07 crc kubenswrapper[4835]: I1002 12:17:07.821145 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:17:07 crc kubenswrapper[4835]: I1002 12:17:07.871268 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:17:08 crc kubenswrapper[4835]: I1002 12:17:08.129184 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:17:08 crc kubenswrapper[4835]: I1002 12:17:08.175992 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2vc4"] Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.107889 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l2vc4" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="registry-server" containerID="cri-o://4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e" gracePeriod=2 Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.610798 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.740884 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qww89\" (UniqueName: \"kubernetes.io/projected/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-kube-api-access-qww89\") pod \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.741044 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-utilities\") pod \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.741086 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-catalog-content\") pod \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\" (UID: \"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d\") " Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.742004 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-utilities" (OuterVolumeSpecName: "utilities") pod "acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" (UID: "acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.746961 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-kube-api-access-qww89" (OuterVolumeSpecName: "kube-api-access-qww89") pod "acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" (UID: "acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d"). InnerVolumeSpecName "kube-api-access-qww89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.793894 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" (UID: "acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.843243 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qww89\" (UniqueName: \"kubernetes.io/projected/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-kube-api-access-qww89\") on node \"crc\" DevicePath \"\"" Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.843326 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:17:10 crc kubenswrapper[4835]: I1002 12:17:10.843342 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.119163 4835 generic.go:334] "Generic (PLEG): container finished" podID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerID="4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e" exitCode=0 Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.119547 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2vc4" event={"ID":"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d","Type":"ContainerDied","Data":"4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e"} Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.119584 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2vc4" event={"ID":"acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d","Type":"ContainerDied","Data":"78e5dedb3a50873c42ff6c61ba67f549cc019e6085d89d2e3df8bf99dc52a6bc"} Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.119609 4835 scope.go:117] "RemoveContainer" containerID="4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.119781 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l2vc4" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.156362 4835 scope.go:117] "RemoveContainer" containerID="79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.162269 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2vc4"] Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.172926 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l2vc4"] Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.187763 4835 scope.go:117] "RemoveContainer" containerID="e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b" Oct 02 12:17:11 crc kubenswrapper[4835]: E1002 12:17:11.220146 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc6d6a5_7dee_4d9d_a1a2_18af99dbe00d.slice/crio-78e5dedb3a50873c42ff6c61ba67f549cc019e6085d89d2e3df8bf99dc52a6bc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc6d6a5_7dee_4d9d_a1a2_18af99dbe00d.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.224473 4835 scope.go:117] "RemoveContainer" containerID="4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e" Oct 02 12:17:11 crc kubenswrapper[4835]: E1002 12:17:11.225192 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e\": container with ID starting with 4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e not found: ID does not exist" containerID="4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.225271 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e"} err="failed to get container status \"4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e\": rpc error: code = NotFound desc = could not find container \"4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e\": container with ID starting with 4b9de5e349393d51617b2ab0bb24715ed331024b230be86683bfce2874d97a7e not found: ID does not exist" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.225300 4835 scope.go:117] "RemoveContainer" containerID="79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d" Oct 02 12:17:11 crc kubenswrapper[4835]: E1002 12:17:11.226297 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d\": container with ID starting with 79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d not found: ID does not exist" containerID="79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.226342 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d"} err="failed to get container status 
\"79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d\": rpc error: code = NotFound desc = could not find container \"79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d\": container with ID starting with 79718a700d55963bec6a8c4d7dc316cfd33495af398f8af62310a1be6304689d not found: ID does not exist" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.226372 4835 scope.go:117] "RemoveContainer" containerID="e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b" Oct 02 12:17:11 crc kubenswrapper[4835]: E1002 12:17:11.226706 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b\": container with ID starting with e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b not found: ID does not exist" containerID="e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b" Oct 02 12:17:11 crc kubenswrapper[4835]: I1002 12:17:11.226732 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b"} err="failed to get container status \"e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b\": rpc error: code = NotFound desc = could not find container \"e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b\": container with ID starting with e8be32074191508d6f8511f84c594db99f464134ae236d90b2cd73761d05727b not found: ID does not exist" Oct 02 12:17:12 crc kubenswrapper[4835]: I1002 12:17:12.263412 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" path="/var/lib/kubelet/pods/acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d/volumes" Oct 02 12:19:11 crc kubenswrapper[4835]: I1002 12:19:11.984032 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:19:11 crc kubenswrapper[4835]: I1002 12:19:11.984631 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:19:41 crc kubenswrapper[4835]: I1002 12:19:41.983913 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:19:41 crc kubenswrapper[4835]: I1002 12:19:41.984610 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:20:11 crc kubenswrapper[4835]: I1002 12:20:11.984691 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:20:11 crc kubenswrapper[4835]: I1002 12:20:11.985358 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:20:11 crc kubenswrapper[4835]: I1002 12:20:11.985421 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 12:20:11 crc kubenswrapper[4835]: I1002 12:20:11.986406 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8db63e229346a83437dd96fd651fc43f648b9d957136005668f3992bb347a1c7"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:20:11 crc kubenswrapper[4835]: I1002 12:20:11.986465 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://8db63e229346a83437dd96fd651fc43f648b9d957136005668f3992bb347a1c7" gracePeriod=600 Oct 02 12:20:12 crc kubenswrapper[4835]: I1002 12:20:12.837576 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="8db63e229346a83437dd96fd651fc43f648b9d957136005668f3992bb347a1c7" exitCode=0 Oct 02 12:20:12 crc kubenswrapper[4835]: I1002 12:20:12.837667 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"8db63e229346a83437dd96fd651fc43f648b9d957136005668f3992bb347a1c7"} Oct 02 12:20:12 crc kubenswrapper[4835]: I1002 12:20:12.838195 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a"} Oct 02 12:20:12 crc kubenswrapper[4835]: I1002 12:20:12.838240 4835 scope.go:117] "RemoveContainer" containerID="c2606b92df4bcfeade5caa422cea5ba8b161e41671e229f6e68349169ada8f24" Oct 02 12:22:41 crc kubenswrapper[4835]: I1002 12:22:41.984105 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:22:41 crc kubenswrapper[4835]: I1002 12:22:41.984919 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:23:11 crc kubenswrapper[4835]: I1002 12:23:11.983635 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:23:11 crc kubenswrapper[4835]: I1002 12:23:11.984265 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.392515 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-swgx4"] Oct 02 12:23:26 crc kubenswrapper[4835]: E1002 12:23:26.393576 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="extract-content" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.393597 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="extract-content" Oct 02 12:23:26 crc kubenswrapper[4835]: E1002 12:23:26.393612 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="registry-server" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.393620 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="registry-server" Oct 02 12:23:26 crc kubenswrapper[4835]: E1002 12:23:26.393650 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="extract-utilities" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.393658 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="extract-utilities" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.393979 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc6d6a5-7dee-4d9d-a1a2-18af99dbe00d" containerName="registry-server" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.395736 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.401995 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swgx4"] Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.468438 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-utilities\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.468493 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74fl\" (UniqueName: \"kubernetes.io/projected/66608961-e1bf-46cf-a342-9be790d652c6-kube-api-access-z74fl\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.468687 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-catalog-content\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.570141 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-catalog-content\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.570325 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-utilities\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.570363 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74fl\" (UniqueName: \"kubernetes.io/projected/66608961-e1bf-46cf-a342-9be790d652c6-kube-api-access-z74fl\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.570668 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-catalog-content\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.570854 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-utilities\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.595161 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z74fl\" (UniqueName: \"kubernetes.io/projected/66608961-e1bf-46cf-a342-9be790d652c6-kube-api-access-z74fl\") pod \"community-operators-swgx4\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:26 crc kubenswrapper[4835]: I1002 12:23:26.719827 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:27 crc kubenswrapper[4835]: I1002 12:23:27.371156 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swgx4"] Oct 02 12:23:27 crc kubenswrapper[4835]: I1002 12:23:27.728754 4835 generic.go:334] "Generic (PLEG): container finished" podID="66608961-e1bf-46cf-a342-9be790d652c6" containerID="b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58" exitCode=0 Oct 02 12:23:27 crc kubenswrapper[4835]: I1002 12:23:27.728804 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swgx4" event={"ID":"66608961-e1bf-46cf-a342-9be790d652c6","Type":"ContainerDied","Data":"b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58"} Oct 02 12:23:27 crc kubenswrapper[4835]: I1002 12:23:27.728836 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swgx4" event={"ID":"66608961-e1bf-46cf-a342-9be790d652c6","Type":"ContainerStarted","Data":"f6398c92e7d748d2bc84caa04aad8bb92d602b03113a44168718ab2a80f43a56"} Oct 02 12:23:27 crc kubenswrapper[4835]: I1002 12:23:27.731563 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:23:29 crc kubenswrapper[4835]: I1002 12:23:29.749733 4835 generic.go:334] "Generic (PLEG): container finished" podID="66608961-e1bf-46cf-a342-9be790d652c6" containerID="bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534" exitCode=0 Oct 02 12:23:29 crc kubenswrapper[4835]: I1002 12:23:29.750260 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swgx4" event={"ID":"66608961-e1bf-46cf-a342-9be790d652c6","Type":"ContainerDied","Data":"bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534"} Oct 02 12:23:30 crc kubenswrapper[4835]: I1002 12:23:30.763288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swgx4" event={"ID":"66608961-e1bf-46cf-a342-9be790d652c6","Type":"ContainerStarted","Data":"cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c"} Oct 02 12:23:30 crc kubenswrapper[4835]: I1002 12:23:30.788943 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-swgx4" podStartSLOduration=2.308740802 podStartE2EDuration="4.788922615s" podCreationTimestamp="2025-10-02 12:23:26 +0000 UTC" firstStartedPulling="2025-10-02 12:23:27.731319088 +0000 UTC m=+5284.291226669" lastFinishedPulling="2025-10-02 12:23:30.211500901 +0000 UTC m=+5286.771408482" observedRunningTime="2025-10-02 12:23:30.788515933 +0000 UTC m=+5287.348423524" watchObservedRunningTime="2025-10-02 12:23:30.788922615 +0000 UTC m=+5287.348830216" Oct 02 12:23:36 crc kubenswrapper[4835]: I1002 12:23:36.720613 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:36 crc kubenswrapper[4835]: I1002 12:23:36.721405 4835 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:36 crc kubenswrapper[4835]: I1002 12:23:36.771496 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:36 crc kubenswrapper[4835]: I1002 12:23:36.865640 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:37 crc kubenswrapper[4835]: I1002 12:23:37.005742 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swgx4"] Oct 02 12:23:38 crc kubenswrapper[4835]: I1002 12:23:38.837094 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-swgx4" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="registry-server" containerID="cri-o://cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c" gracePeriod=2 Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.344810 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.439768 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-catalog-content\") pod \"66608961-e1bf-46cf-a342-9be790d652c6\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.440151 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-utilities\") pod \"66608961-e1bf-46cf-a342-9be790d652c6\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.440517 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z74fl\" (UniqueName: \"kubernetes.io/projected/66608961-e1bf-46cf-a342-9be790d652c6-kube-api-access-z74fl\") pod \"66608961-e1bf-46cf-a342-9be790d652c6\" (UID: \"66608961-e1bf-46cf-a342-9be790d652c6\") " Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.445920 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-utilities" (OuterVolumeSpecName: "utilities") pod "66608961-e1bf-46cf-a342-9be790d652c6" (UID: "66608961-e1bf-46cf-a342-9be790d652c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.455688 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66608961-e1bf-46cf-a342-9be790d652c6-kube-api-access-z74fl" (OuterVolumeSpecName: "kube-api-access-z74fl") pod "66608961-e1bf-46cf-a342-9be790d652c6" (UID: "66608961-e1bf-46cf-a342-9be790d652c6"). InnerVolumeSpecName "kube-api-access-z74fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.502057 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66608961-e1bf-46cf-a342-9be790d652c6" (UID: "66608961-e1bf-46cf-a342-9be790d652c6"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.542958 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.543194 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66608961-e1bf-46cf-a342-9be790d652c6-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.543350 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z74fl\" (UniqueName: \"kubernetes.io/projected/66608961-e1bf-46cf-a342-9be790d652c6-kube-api-access-z74fl\") on node \"crc\" DevicePath \"\"" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.850397 4835 generic.go:334] "Generic (PLEG): container finished" podID="66608961-e1bf-46cf-a342-9be790d652c6" containerID="cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c" exitCode=0 Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.850460 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swgx4" event={"ID":"66608961-e1bf-46cf-a342-9be790d652c6","Type":"ContainerDied","Data":"cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c"} Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.850501 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swgx4" event={"ID":"66608961-e1bf-46cf-a342-9be790d652c6","Type":"ContainerDied","Data":"f6398c92e7d748d2bc84caa04aad8bb92d602b03113a44168718ab2a80f43a56"} Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.850529 4835 scope.go:117] "RemoveContainer" containerID="cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.850707 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-swgx4" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.878493 4835 scope.go:117] "RemoveContainer" containerID="bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.896772 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swgx4"] Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.906861 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-swgx4"] Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.925242 4835 scope.go:117] "RemoveContainer" containerID="b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.955165 4835 scope.go:117] "RemoveContainer" containerID="cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c" Oct 02 12:23:39 crc kubenswrapper[4835]: E1002 12:23:39.955621 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c\": container with ID starting with cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c not found: ID does not exist" containerID="cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.955655 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c"} err="failed to get container status \"cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c\": rpc error: code = NotFound desc = could not find container \"cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c\": container with ID starting with cd12828dbb30e0a2acb12ffa51584ba602775ea1d5a5e19ac890233d9078b15c not found: ID does not exist" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.955676 4835 scope.go:117] "RemoveContainer" containerID="bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534" Oct 02 12:23:39 crc kubenswrapper[4835]: E1002 12:23:39.955903 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534\": container with ID starting with bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534 not found: ID does not exist" containerID="bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.955933 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534"} err="failed to get container status \"bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534\": rpc error: code = NotFound desc = could not find container \"bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534\": container with ID starting with bab033a4a128c4a441101e0386d54fbf34cb9d048b5e8ff865a9b523b26e2534 not found: ID does not exist" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.955948 4835 scope.go:117] "RemoveContainer" containerID="b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58" Oct 02 12:23:39 crc kubenswrapper[4835]: E1002 12:23:39.956313 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58\": container with ID starting with b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58 not found: ID does not exist" containerID="b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58" Oct 02 12:23:39 crc kubenswrapper[4835]: I1002 12:23:39.956339 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58"} err="failed to get container status \"b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58\": rpc error: code = NotFound desc = could not find container \"b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58\": container with ID starting with b047806d749f7bbc3fd7c943efb8a46b72b03ec73eab5e3646009628078f3d58 not found: ID does not exist" Oct 02 12:23:40 crc kubenswrapper[4835]: I1002 12:23:40.265484 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66608961-e1bf-46cf-a342-9be790d652c6" path="/var/lib/kubelet/pods/66608961-e1bf-46cf-a342-9be790d652c6/volumes" Oct 02 12:23:41 crc kubenswrapper[4835]: I1002 12:23:41.983681 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:23:41 crc kubenswrapper[4835]: I1002 12:23:41.984429 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:23:41 crc kubenswrapper[4835]: I1002 12:23:41.984498 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 12:23:41 crc kubenswrapper[4835]: I1002 12:23:41.985632 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:23:41 crc kubenswrapper[4835]: I1002 12:23:41.985737 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" gracePeriod=600 Oct 02 12:23:42 crc kubenswrapper[4835]: E1002 12:23:42.128689 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:23:42 crc kubenswrapper[4835]: I1002 12:23:42.880912 4835 generic.go:334] 
"Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" exitCode=0 Oct 02 12:23:42 crc kubenswrapper[4835]: I1002 12:23:42.880977 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a"} Oct 02 12:23:42 crc kubenswrapper[4835]: I1002 12:23:42.881276 4835 scope.go:117] "RemoveContainer" containerID="8db63e229346a83437dd96fd651fc43f648b9d957136005668f3992bb347a1c7" Oct 02 12:23:42 crc kubenswrapper[4835]: I1002 12:23:42.882212 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:23:42 crc kubenswrapper[4835]: E1002 12:23:42.882655 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:23:57 crc kubenswrapper[4835]: I1002 12:23:57.252164 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:23:57 crc kubenswrapper[4835]: E1002 12:23:57.252980 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:24:10 crc kubenswrapper[4835]: I1002 12:24:10.252862 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:24:10 crc kubenswrapper[4835]: E1002 12:24:10.253859 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:24:25 crc kubenswrapper[4835]: I1002 12:24:25.251931 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:24:25 crc kubenswrapper[4835]: E1002 12:24:25.253843 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:24:36 crc kubenswrapper[4835]: I1002 12:24:36.252495 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" 
Oct 02 12:24:36 crc kubenswrapper[4835]: E1002 12:24:36.253242 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:24:48 crc kubenswrapper[4835]: I1002 12:24:48.252061 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:24:48 crc kubenswrapper[4835]: E1002 12:24:48.252934 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:24:59 crc kubenswrapper[4835]: I1002 12:24:59.252157 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:24:59 crc kubenswrapper[4835]: E1002 12:24:59.252973 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:25:07 crc kubenswrapper[4835]: I1002 12:25:07.723056 4835 generic.go:334] "Generic (PLEG): container finished" podID="5922a15b-856f-45aa-aed9-d8787e4f470f" containerID="9c236c621536713ca9f903b52acc230578e65ecedfc67236d5d73141893a30e6" exitCode=1 Oct 02 12:25:07 crc kubenswrapper[4835]: I1002 12:25:07.723128 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5922a15b-856f-45aa-aed9-d8787e4f470f","Type":"ContainerDied","Data":"9c236c621536713ca9f903b52acc230578e65ecedfc67236d5d73141893a30e6"} Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.106356 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215115 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-config-data\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215206 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whzxd\" (UniqueName: \"kubernetes.io/projected/5922a15b-856f-45aa-aed9-d8787e4f470f-kube-api-access-whzxd\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215264 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ca-certs\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215296 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215368 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-workdir\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215415 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ssh-key\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215546 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config-secret\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215591 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-temporary\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.215626 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5922a15b-856f-45aa-aed9-d8787e4f470f\" (UID: \"5922a15b-856f-45aa-aed9-d8787e4f470f\") " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.216399 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.216660 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-config-data" (OuterVolumeSpecName: "config-data") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.220021 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.230730 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.233409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5922a15b-856f-45aa-aed9-d8787e4f470f-kube-api-access-whzxd" (OuterVolumeSpecName: "kube-api-access-whzxd") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "kube-api-access-whzxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.244128 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.251518 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.263829 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.266860 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5922a15b-856f-45aa-aed9-d8787e4f470f" (UID: "5922a15b-856f-45aa-aed9-d8787e4f470f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318289 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318350 4835 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318411 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318426 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318440 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whzxd\" (UniqueName: \"kubernetes.io/projected/5922a15b-856f-45aa-aed9-d8787e4f470f-kube-api-access-whzxd\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318453 4835 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318485 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5922a15b-856f-45aa-aed9-d8787e4f470f-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318496 4835 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5922a15b-856f-45aa-aed9-d8787e4f470f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.318512 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5922a15b-856f-45aa-aed9-d8787e4f470f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.348964 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.420556 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.745755 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"5922a15b-856f-45aa-aed9-d8787e4f470f","Type":"ContainerDied","Data":"a913d017a655c1ade9a45bbd6d633b49a76f302b7083b944c6e7c9e02eee428e"} Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.746024 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a913d017a655c1ade9a45bbd6d633b49a76f302b7083b944c6e7c9e02eee428e" Oct 02 12:25:09 crc kubenswrapper[4835]: I1002 12:25:09.745833 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 12:25:10 crc kubenswrapper[4835]: I1002 12:25:10.252027 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:25:10 crc kubenswrapper[4835]: E1002 12:25:10.252612 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.355436 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 12:25:15 crc kubenswrapper[4835]: E1002 12:25:15.356397 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="extract-utilities" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.356414 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="extract-utilities" Oct 02 12:25:15 crc kubenswrapper[4835]: E1002 12:25:15.356429 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5922a15b-856f-45aa-aed9-d8787e4f470f" containerName="tempest-tests-tempest-tests-runner" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.356435 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5922a15b-856f-45aa-aed9-d8787e4f470f" containerName="tempest-tests-tempest-tests-runner" Oct 02 12:25:15 crc kubenswrapper[4835]: E1002 12:25:15.356443 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="extract-content" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.356451 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="extract-content" Oct 02 12:25:15 crc kubenswrapper[4835]: E1002 12:25:15.356679 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="registry-server" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.356685 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="registry-server" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.356902 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5922a15b-856f-45aa-aed9-d8787e4f470f" containerName="tempest-tests-tempest-tests-runner" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.356915 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="66608961-e1bf-46cf-a342-9be790d652c6" containerName="registry-server" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 
12:25:15.357561 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.360243 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lrw8k" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.370515 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.441629 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jtw9\" (UniqueName: \"kubernetes.io/projected/f8933811-45f7-4f7e-b7bb-6fe07421852c-kube-api-access-6jtw9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f8933811-45f7-4f7e-b7bb-6fe07421852c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.441710 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f8933811-45f7-4f7e-b7bb-6fe07421852c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.543356 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jtw9\" (UniqueName: \"kubernetes.io/projected/f8933811-45f7-4f7e-b7bb-6fe07421852c-kube-api-access-6jtw9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f8933811-45f7-4f7e-b7bb-6fe07421852c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.543432 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f8933811-45f7-4f7e-b7bb-6fe07421852c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.543961 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f8933811-45f7-4f7e-b7bb-6fe07421852c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.569250 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f8933811-45f7-4f7e-b7bb-6fe07421852c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.580892 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jtw9\" (UniqueName: \"kubernetes.io/projected/f8933811-45f7-4f7e-b7bb-6fe07421852c-kube-api-access-6jtw9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f8933811-45f7-4f7e-b7bb-6fe07421852c\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:15 crc kubenswrapper[4835]: I1002 12:25:15.692052 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 12:25:16 crc kubenswrapper[4835]: I1002 12:25:16.183997 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 12:25:16 crc kubenswrapper[4835]: I1002 12:25:16.812611 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f8933811-45f7-4f7e-b7bb-6fe07421852c","Type":"ContainerStarted","Data":"62125df650557ba3ed4b45fd47ecefce98874e92394b7095b9ec96290e876c01"} Oct 02 12:25:18 crc kubenswrapper[4835]: I1002 12:25:18.827544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f8933811-45f7-4f7e-b7bb-6fe07421852c","Type":"ContainerStarted","Data":"caf0f9abb689861804f3d249274fe1165b5837924f5a8ccdf4332767033e3221"} Oct 02 12:25:18 crc kubenswrapper[4835]: I1002 12:25:18.848659 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.385541962 podStartE2EDuration="3.848644246s" podCreationTimestamp="2025-10-02 12:25:15 +0000 UTC" firstStartedPulling="2025-10-02 12:25:16.200558139 +0000 UTC m=+5392.760465710" lastFinishedPulling="2025-10-02 12:25:17.663660413 +0000 UTC m=+5394.223567994" observedRunningTime="2025-10-02 12:25:18.841930403 +0000 UTC m=+5395.401837994" watchObservedRunningTime="2025-10-02 12:25:18.848644246 +0000 UTC m=+5395.408551827" Oct 02 12:25:21 crc kubenswrapper[4835]: I1002 12:25:21.251864 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:25:21 crc kubenswrapper[4835]: E1002 12:25:21.252405 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:25:34 crc kubenswrapper[4835]: I1002 12:25:34.258413 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:25:34 crc kubenswrapper[4835]: E1002 12:25:34.259185 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.318607 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfs2/must-gather-g9662"] Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.320688 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.322713 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nnfs2"/"default-dockercfg-9klmn" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.323004 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nnfs2"/"openshift-service-ca.crt" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.323126 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nnfs2"/"kube-root-ca.crt" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.329940 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnfs2/must-gather-g9662"] Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.417371 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpfhh\" (UniqueName: \"kubernetes.io/projected/9d5a0b00-da61-493c-a758-56d420fe4971-kube-api-access-fpfhh\") pod \"must-gather-g9662\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.417565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5a0b00-da61-493c-a758-56d420fe4971-must-gather-output\") pod \"must-gather-g9662\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.519208 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5a0b00-da61-493c-a758-56d420fe4971-must-gather-output\") pod \"must-gather-g9662\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.519330 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpfhh\" (UniqueName: \"kubernetes.io/projected/9d5a0b00-da61-493c-a758-56d420fe4971-kube-api-access-fpfhh\") pod \"must-gather-g9662\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.519774 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5a0b00-da61-493c-a758-56d420fe4971-must-gather-output\") pod \"must-gather-g9662\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.543301 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpfhh\" (UniqueName: \"kubernetes.io/projected/9d5a0b00-da61-493c-a758-56d420fe4971-kube-api-access-fpfhh\") pod \"must-gather-g9662\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:40 crc kubenswrapper[4835]: I1002 12:25:40.662773 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:25:41 crc kubenswrapper[4835]: I1002 12:25:41.171430 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnfs2/must-gather-g9662"] Oct 02 12:25:42 crc kubenswrapper[4835]: I1002 12:25:42.053106 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/must-gather-g9662" event={"ID":"9d5a0b00-da61-493c-a758-56d420fe4971","Type":"ContainerStarted","Data":"69672e379bb7adf813e98d2c2e38ebe11c867265e1fade63fff0e176922991d1"} Oct 02 12:25:48 crc kubenswrapper[4835]: I1002 12:25:48.113702 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/must-gather-g9662" event={"ID":"9d5a0b00-da61-493c-a758-56d420fe4971","Type":"ContainerStarted","Data":"9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1"} Oct 02 12:25:48 crc kubenswrapper[4835]: I1002 12:25:48.114283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/must-gather-g9662" event={"ID":"9d5a0b00-da61-493c-a758-56d420fe4971","Type":"ContainerStarted","Data":"7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b"} Oct 02 12:25:48 crc kubenswrapper[4835]: I1002 12:25:48.135161 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnfs2/must-gather-g9662" podStartSLOduration=1.910790278 podStartE2EDuration="8.135139021s" podCreationTimestamp="2025-10-02 12:25:40 +0000 UTC" firstStartedPulling="2025-10-02 12:25:41.192077183 +0000 UTC m=+5417.751984764" lastFinishedPulling="2025-10-02 12:25:47.416425926 +0000 UTC m=+5423.976333507" observedRunningTime="2025-10-02 12:25:48.134961296 +0000 UTC m=+5424.694868877" watchObservedRunningTime="2025-10-02 12:25:48.135139021 +0000 UTC m=+5424.695046602" Oct 02 12:25:48 crc kubenswrapper[4835]: I1002 12:25:48.252444 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:25:48 crc kubenswrapper[4835]: E1002 12:25:48.252946 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.547779 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-jnqw8"] Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.549864 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.649585 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/890c484e-50e0-4f72-a950-934b03f085b7-host\") pod \"crc-debug-jnqw8\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.649774 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c67g7\" (UniqueName: \"kubernetes.io/projected/890c484e-50e0-4f72-a950-934b03f085b7-kube-api-access-c67g7\") pod \"crc-debug-jnqw8\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.752547 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c67g7\" (UniqueName: \"kubernetes.io/projected/890c484e-50e0-4f72-a950-934b03f085b7-kube-api-access-c67g7\") pod \"crc-debug-jnqw8\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.753124 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/890c484e-50e0-4f72-a950-934b03f085b7-host\") pod \"crc-debug-jnqw8\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.753300 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/890c484e-50e0-4f72-a950-934b03f085b7-host\") pod \"crc-debug-jnqw8\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.776287 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c67g7\" (UniqueName: \"kubernetes.io/projected/890c484e-50e0-4f72-a950-934b03f085b7-kube-api-access-c67g7\") pod \"crc-debug-jnqw8\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:51 crc kubenswrapper[4835]: I1002 12:25:51.875383 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:25:52 crc kubenswrapper[4835]: I1002 12:25:52.146616 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" event={"ID":"890c484e-50e0-4f72-a950-934b03f085b7","Type":"ContainerStarted","Data":"865b4e9cf8692ad96a1beeb724adb9d14bd9fc99a96e5b3a680ac36471011be7"} Oct 02 12:25:53 crc kubenswrapper[4835]: E1002 12:25:53.996492 4835 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.169:52508->38.102.83.169:37707: write tcp 38.102.83.169:52508->38.102.83.169:37707: write: connection reset by peer Oct 02 12:25:59 crc kubenswrapper[4835]: I1002 12:25:59.251732 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:25:59 crc kubenswrapper[4835]: E1002 12:25:59.252639 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:26:04 crc kubenswrapper[4835]: I1002 12:26:04.280747 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" event={"ID":"890c484e-50e0-4f72-a950-934b03f085b7","Type":"ContainerStarted","Data":"d878f12789396ba88162a95016421042b8f371741f260b99ac2c492439f50eff"} Oct 02 12:26:04 crc kubenswrapper[4835]: I1002 12:26:04.303774 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" podStartSLOduration=1.243873974 podStartE2EDuration="13.303754674s" podCreationTimestamp="2025-10-02 12:25:51 +0000 UTC" firstStartedPulling="2025-10-02 12:25:51.930275013 +0000 UTC m=+5428.490182594" lastFinishedPulling="2025-10-02 12:26:03.990155713 +0000 UTC m=+5440.550063294" observedRunningTime="2025-10-02 12:26:04.294399936 +0000 UTC m=+5440.854307517" watchObservedRunningTime="2025-10-02 12:26:04.303754674 +0000 UTC m=+5440.863662255" Oct 02 12:26:12 crc kubenswrapper[4835]: I1002 12:26:12.251986 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:26:12 crc kubenswrapper[4835]: E1002 12:26:12.253334 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.747883 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9x9jv"] Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.753174 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.764285 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9x9jv"] Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.893825 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-catalog-content\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.893871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjw7\" (UniqueName: \"kubernetes.io/projected/fbe8e92a-7c50-436a-8353-6a4d19026895-kube-api-access-bkjw7\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.894053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-utilities\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.997034 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-utilities\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.997253 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-catalog-content\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.997283 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjw7\" (UniqueName: \"kubernetes.io/projected/fbe8e92a-7c50-436a-8353-6a4d19026895-kube-api-access-bkjw7\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.997682 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-utilities\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:16 crc kubenswrapper[4835]: I1002 12:26:16.997852 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-catalog-content\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:17 crc kubenswrapper[4835]: I1002 12:26:17.026705 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bkjw7\" (UniqueName: \"kubernetes.io/projected/fbe8e92a-7c50-436a-8353-6a4d19026895-kube-api-access-bkjw7\") pod \"redhat-operators-9x9jv\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:17 crc kubenswrapper[4835]: I1002 12:26:17.096062 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:17 crc kubenswrapper[4835]: I1002 12:26:17.695382 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9x9jv"] Oct 02 12:26:18 crc kubenswrapper[4835]: I1002 12:26:18.421382 4835 generic.go:334] "Generic (PLEG): container finished" podID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerID="9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e" exitCode=0 Oct 02 12:26:18 crc kubenswrapper[4835]: I1002 12:26:18.421526 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x9jv" event={"ID":"fbe8e92a-7c50-436a-8353-6a4d19026895","Type":"ContainerDied","Data":"9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e"} Oct 02 12:26:18 crc kubenswrapper[4835]: I1002 12:26:18.421903 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x9jv" event={"ID":"fbe8e92a-7c50-436a-8353-6a4d19026895","Type":"ContainerStarted","Data":"815cdcf0a71e615cbe6dbba62e844f5d34181507dda575c1328b62f73be1b9c0"} Oct 02 12:26:20 crc kubenswrapper[4835]: I1002 12:26:20.442638 4835 generic.go:334] "Generic (PLEG): container finished" podID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerID="a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0" exitCode=0 Oct 02 12:26:20 crc kubenswrapper[4835]: I1002 12:26:20.442755 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x9jv" event={"ID":"fbe8e92a-7c50-436a-8353-6a4d19026895","Type":"ContainerDied","Data":"a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0"} Oct 02 12:26:21 crc kubenswrapper[4835]: I1002 12:26:21.506584 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x9jv" event={"ID":"fbe8e92a-7c50-436a-8353-6a4d19026895","Type":"ContainerStarted","Data":"99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96"} Oct 02 12:26:23 crc kubenswrapper[4835]: I1002 12:26:23.252170 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:26:23 crc kubenswrapper[4835]: E1002 12:26:23.253816 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:26:27 crc kubenswrapper[4835]: I1002 12:26:27.098515 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:27 crc kubenswrapper[4835]: I1002 12:26:27.099417 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:27 crc kubenswrapper[4835]: I1002 12:26:27.168459 4835 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:27 crc kubenswrapper[4835]: I1002 12:26:27.188676 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9x9jv" podStartSLOduration=8.434800904 podStartE2EDuration="11.188656784s" podCreationTimestamp="2025-10-02 12:26:16 +0000 UTC" firstStartedPulling="2025-10-02 12:26:18.423615792 +0000 UTC m=+5454.983523373" lastFinishedPulling="2025-10-02 12:26:21.177471662 +0000 UTC m=+5457.737379253" observedRunningTime="2025-10-02 12:26:21.538812441 +0000 UTC m=+5458.098720022" watchObservedRunningTime="2025-10-02 12:26:27.188656784 +0000 UTC m=+5463.748564365" Oct 02 12:26:27 crc kubenswrapper[4835]: I1002 12:26:27.641151 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:27 crc kubenswrapper[4835]: I1002 12:26:27.687843 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9x9jv"] Oct 02 12:26:29 crc kubenswrapper[4835]: I1002 12:26:29.595097 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9x9jv" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="registry-server" containerID="cri-o://99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96" gracePeriod=2 Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.422476 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.504453 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-utilities\") pod \"fbe8e92a-7c50-436a-8353-6a4d19026895\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.504729 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkjw7\" (UniqueName: \"kubernetes.io/projected/fbe8e92a-7c50-436a-8353-6a4d19026895-kube-api-access-bkjw7\") pod \"fbe8e92a-7c50-436a-8353-6a4d19026895\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.504762 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-catalog-content\") pod \"fbe8e92a-7c50-436a-8353-6a4d19026895\" (UID: \"fbe8e92a-7c50-436a-8353-6a4d19026895\") " Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.510372 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-utilities" (OuterVolumeSpecName: "utilities") pod "fbe8e92a-7c50-436a-8353-6a4d19026895" (UID: "fbe8e92a-7c50-436a-8353-6a4d19026895"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.530460 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe8e92a-7c50-436a-8353-6a4d19026895-kube-api-access-bkjw7" (OuterVolumeSpecName: "kube-api-access-bkjw7") pod "fbe8e92a-7c50-436a-8353-6a4d19026895" (UID: "fbe8e92a-7c50-436a-8353-6a4d19026895"). InnerVolumeSpecName "kube-api-access-bkjw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.602209 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbe8e92a-7c50-436a-8353-6a4d19026895" (UID: "fbe8e92a-7c50-436a-8353-6a4d19026895"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.609748 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.610003 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkjw7\" (UniqueName: \"kubernetes.io/projected/fbe8e92a-7c50-436a-8353-6a4d19026895-kube-api-access-bkjw7\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.610100 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe8e92a-7c50-436a-8353-6a4d19026895-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.614740 4835 generic.go:334] "Generic (PLEG): container finished" podID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerID="99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96" exitCode=0 Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.614815 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x9jv" event={"ID":"fbe8e92a-7c50-436a-8353-6a4d19026895","Type":"ContainerDied","Data":"99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96"} Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.615035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x9jv" event={"ID":"fbe8e92a-7c50-436a-8353-6a4d19026895","Type":"ContainerDied","Data":"815cdcf0a71e615cbe6dbba62e844f5d34181507dda575c1328b62f73be1b9c0"} Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.615133 4835 scope.go:117] "RemoveContainer" containerID="99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.614857 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9x9jv" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.639008 4835 scope.go:117] "RemoveContainer" containerID="a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.660249 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9x9jv"] Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.673550 4835 scope.go:117] "RemoveContainer" containerID="9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.678600 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9x9jv"] Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.727850 4835 scope.go:117] "RemoveContainer" containerID="99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96" Oct 02 12:26:30 crc kubenswrapper[4835]: E1002 12:26:30.728309 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96\": container with ID starting with 99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96 not found: ID does not exist" containerID="99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.728362 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96"} err="failed to get container status \"99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96\": rpc error: code = NotFound desc = could not find container \"99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96\": container with ID starting with 99692fe6172b3b0ba6e1a41f8934fc046cc1843330cc1e2903f596ac062c3a96 not found: ID does not exist" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.728402 4835 scope.go:117] "RemoveContainer" containerID="a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0" Oct 02 12:26:30 crc kubenswrapper[4835]: E1002 12:26:30.730090 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0\": container with ID starting with a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0 not found: ID does not exist" containerID="a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.730121 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0"} err="failed to get container status \"a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0\": rpc error: code = NotFound desc = could not find container \"a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0\": container with ID starting with a0d13e1c621d71b4ac49f741b5046da9b152a68ca948588b2183581ea94e28c0 not found: ID does not exist" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.730143 4835 scope.go:117] "RemoveContainer" containerID="9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e" Oct 02 12:26:30 crc kubenswrapper[4835]: E1002 12:26:30.731871 4835 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e\": container with ID starting with 9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e not found: ID does not exist" containerID="9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e" Oct 02 12:26:30 crc kubenswrapper[4835]: I1002 12:26:30.731900 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e"} err="failed to get container status \"9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e\": rpc error: code = NotFound desc = could not find container \"9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e\": container with ID starting with 9b2fbaf2a6a54a107db47ba56cea51b6fcf4cc0313e1090d25409979271f0a8e not found: ID does not exist" Oct 02 12:26:32 crc kubenswrapper[4835]: I1002 12:26:32.263026 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" path="/var/lib/kubelet/pods/fbe8e92a-7c50-436a-8353-6a4d19026895/volumes" Oct 02 12:26:36 crc kubenswrapper[4835]: I1002 12:26:36.252803 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:26:36 crc kubenswrapper[4835]: E1002 12:26:36.253430 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:26:48 crc kubenswrapper[4835]: I1002 12:26:48.252084 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:26:48 crc kubenswrapper[4835]: E1002 12:26:48.252945 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:26:59 crc kubenswrapper[4835]: I1002 12:26:59.252903 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:26:59 crc kubenswrapper[4835]: E1002 12:26:59.253763 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:27:00 crc kubenswrapper[4835]: I1002 12:27:00.831468 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5956b78c54-6g8cs_9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1/barbican-api/0.log" Oct 02 12:27:01 crc kubenswrapper[4835]: I1002 12:27:01.175196 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-5956b78c54-6g8cs_9e7da97d-16d8-4f7f-b51f-82cc7c0b61e1/barbican-api-log/0.log" Oct 02 12:27:01 crc kubenswrapper[4835]: I1002 12:27:01.257560 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cfcd5b5b-2xqb4_6557e02c-d16a-4b3b-8d22-00662118e581/barbican-keystone-listener/0.log" Oct 02 12:27:01 crc kubenswrapper[4835]: I1002 12:27:01.510097 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cfcd5b5b-2xqb4_6557e02c-d16a-4b3b-8d22-00662118e581/barbican-keystone-listener-log/0.log" Oct 02 12:27:01 crc kubenswrapper[4835]: I1002 12:27:01.664618 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5547df6bdc-2nktk_6e1683aa-2540-417c-8334-10082451475b/barbican-worker/0.log" Oct 02 12:27:01 crc kubenswrapper[4835]: I1002 12:27:01.697182 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5547df6bdc-2nktk_6e1683aa-2540-417c-8334-10082451475b/barbican-worker-log/0.log" Oct 02 12:27:01 crc kubenswrapper[4835]: I1002 12:27:01.892378 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-wfwn9_a9ec4746-21a6-4eb2-bbe4-b929315a91c5/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:02 crc kubenswrapper[4835]: I1002 12:27:02.114924 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec2a0f3a-56b8-4866-9740-d6499077797a/ceilometer-central-agent/0.log" Oct 02 12:27:02 crc kubenswrapper[4835]: I1002 12:27:02.338149 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec2a0f3a-56b8-4866-9740-d6499077797a/ceilometer-notification-agent/0.log" Oct 02 12:27:02 crc kubenswrapper[4835]: I1002 12:27:02.367519 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec2a0f3a-56b8-4866-9740-d6499077797a/proxy-httpd/0.log" Oct 02 12:27:02 crc kubenswrapper[4835]: I1002 12:27:02.497097 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec2a0f3a-56b8-4866-9740-d6499077797a/sg-core/0.log" Oct 02 12:27:02 crc kubenswrapper[4835]: I1002 12:27:02.708578 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-7p5px_7e2bad15-a601-45d3-96df-fe89c132053d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:02 crc kubenswrapper[4835]: I1002 12:27:02.922437 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xp9lk_44c2bc95-3dad-486a-b15d-83158d9619d1/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:03 crc kubenswrapper[4835]: I1002 12:27:03.750821 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_42e4bd48-e5db-4d06-947f-63223788352f/cinder-api/0.log" Oct 02 12:27:04 crc kubenswrapper[4835]: I1002 12:27:04.099745 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_42e4bd48-e5db-4d06-947f-63223788352f/cinder-api-log/0.log" Oct 02 12:27:04 crc kubenswrapper[4835]: I1002 12:27:04.152674 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5351da7e-bf35-4614-95f5-72fb10c1b920/probe/0.log" Oct 02 12:27:04 crc kubenswrapper[4835]: I1002 12:27:04.439724 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_aaa8b6c2-6884-4f83-bb0b-a866426ca426/cinder-scheduler/0.log" Oct 02 12:27:04 crc kubenswrapper[4835]: I1002 12:27:04.583342 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_aaa8b6c2-6884-4f83-bb0b-a866426ca426/probe/0.log" Oct 02 12:27:05 crc kubenswrapper[4835]: I1002 12:27:05.142071 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_3713b30c-b4df-4bce-912c-f8161b5ab949/probe/0.log" Oct 02 12:27:05 crc kubenswrapper[4835]: I1002 12:27:05.690267 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-844jf_d830ff32-e5f8-46b0-ba9c-988561d11e8c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:06 crc kubenswrapper[4835]: I1002 12:27:06.126506 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-27j6x_08ff3367-a3dd-419e-a60e-1e4892b68d1c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:06 crc kubenswrapper[4835]: I1002 12:27:06.581553 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-lqghn_913bcc4c-ed8f-4b09-b645-09cbb3e7943a/init/0.log" Oct 02 12:27:06 crc kubenswrapper[4835]: I1002 12:27:06.848947 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-lqghn_913bcc4c-ed8f-4b09-b645-09cbb3e7943a/init/0.log" Oct 02 12:27:07 crc kubenswrapper[4835]: I1002 12:27:07.238093 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-lqghn_913bcc4c-ed8f-4b09-b645-09cbb3e7943a/dnsmasq-dns/0.log" Oct 02 12:27:07 crc kubenswrapper[4835]: I1002 12:27:07.638061 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a9c14cdd-ae95-4489-a154-8b11c8c2ec87/glance-httpd/0.log" Oct 02 12:27:07 crc kubenswrapper[4835]: I1002 12:27:07.660516 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5351da7e-bf35-4614-95f5-72fb10c1b920/cinder-backup/0.log" Oct 02 12:27:07 crc kubenswrapper[4835]: I1002 12:27:07.892943 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a9c14cdd-ae95-4489-a154-8b11c8c2ec87/glance-log/0.log" Oct 02 12:27:08 crc kubenswrapper[4835]: I1002 12:27:08.083162 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_20d39210-5076-4fca-9c17-dd4b6f18220c/glance-log/0.log" Oct 02 12:27:08 crc kubenswrapper[4835]: I1002 12:27:08.132612 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_20d39210-5076-4fca-9c17-dd4b6f18220c/glance-httpd/0.log" Oct 02 12:27:08 crc kubenswrapper[4835]: I1002 12:27:08.469796 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d655558cb-84687_4fd0f229-d269-4fa9-bd48-0909ce1ce941/horizon/0.log" Oct 02 12:27:08 crc kubenswrapper[4835]: I1002 12:27:08.918663 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nxgdh_6b93c415-6ba0-4183-b6b1-b47166ed39f1/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:08 crc kubenswrapper[4835]: I1002 12:27:08.952559 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-6d655558cb-84687_4fd0f229-d269-4fa9-bd48-0909ce1ce941/horizon-log/0.log" Oct 02 12:27:09 crc kubenswrapper[4835]: I1002 12:27:09.147630 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-8965g_329e8cb4-3d66-4c42-be6d-9cb71fdb008a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:09 crc kubenswrapper[4835]: I1002 12:27:09.865313 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323441-b6crb_cfbb9e42-19fe-4ab0-9b02-38adc586df01/keystone-cron/0.log" Oct 02 12:27:10 crc kubenswrapper[4835]: I1002 12:27:10.279316 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7deec852-7067-4dfe-b052-3e385e350a93/kube-state-metrics/0.log" Oct 02 12:27:10 crc kubenswrapper[4835]: I1002 12:27:10.321589 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85675864fd-9krzl_8753b414-73d0-489d-ab44-2a54891ba36b/keystone-api/0.log" Oct 02 12:27:10 crc kubenswrapper[4835]: I1002 12:27:10.597675 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-m6tsz_9ef15b26-b414-4846-b8e6-6846f04d18c5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:10 crc kubenswrapper[4835]: I1002 12:27:10.951179 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_80b69826-9737-464b-a39c-1b853ed917db/manila-api-log/0.log" Oct 02 12:27:11 crc kubenswrapper[4835]: I1002 12:27:11.015152 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_80b69826-9737-464b-a39c-1b853ed917db/manila-api/0.log" Oct 02 12:27:11 crc kubenswrapper[4835]: I1002 12:27:11.332529 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_36fa1fd3-bc15-4b6a-912f-91bc6942c407/manila-scheduler/0.log" Oct 02 12:27:11 crc kubenswrapper[4835]: I1002 12:27:11.411448 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_36fa1fd3-bc15-4b6a-912f-91bc6942c407/probe/0.log" Oct 02 12:27:11 crc kubenswrapper[4835]: I1002 12:27:11.653584 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8cd67eac-f445-4d9e-b0ef-26de2604c1bd/manila-share/0.log" Oct 02 12:27:11 crc kubenswrapper[4835]: I1002 12:27:11.809674 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8cd67eac-f445-4d9e-b0ef-26de2604c1bd/probe/0.log" Oct 02 12:27:12 crc kubenswrapper[4835]: I1002 12:27:12.666331 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d568cc985-z84bp_e08e86a7-2ef3-48ad-82ab-cffc2007fd24/neutron-api/0.log" Oct 02 12:27:12 crc kubenswrapper[4835]: I1002 12:27:12.975415 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_3713b30c-b4df-4bce-912c-f8161b5ab949/cinder-volume/0.log" Oct 02 12:27:13 crc kubenswrapper[4835]: I1002 12:27:13.083606 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d568cc985-z84bp_e08e86a7-2ef3-48ad-82ab-cffc2007fd24/neutron-httpd/0.log" Oct 02 12:27:13 crc kubenswrapper[4835]: I1002 12:27:13.251820 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:27:13 crc kubenswrapper[4835]: E1002 12:27:13.252488 4835 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:27:13 crc kubenswrapper[4835]: I1002 12:27:13.324996 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qsj42_5665327e-d24d-4cd7-908c-fc1fd10204fb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:14 crc kubenswrapper[4835]: I1002 12:27:14.284641 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2efae1cc-e8ac-43fd-bb26-6e8897e916f8/nova-api-log/0.log" Oct 02 12:27:14 crc kubenswrapper[4835]: I1002 12:27:14.581817 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f17d51a8-a2be-44db-ae3f-92a2110b34be/nova-cell0-conductor-conductor/0.log" Oct 02 12:27:14 crc kubenswrapper[4835]: I1002 12:27:14.771439 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2efae1cc-e8ac-43fd-bb26-6e8897e916f8/nova-api-api/0.log" Oct 02 12:27:14 crc kubenswrapper[4835]: I1002 12:27:14.851810 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_bac27bc0-d222-4588-88b6-d354949459a2/nova-cell1-conductor-conductor/0.log" Oct 02 12:27:15 crc kubenswrapper[4835]: I1002 12:27:15.120778 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_cdee82f4-8419-4cf1-8910-9c5516070f11/nova-cell1-novncproxy-novncproxy/0.log" Oct 02 12:27:15 crc kubenswrapper[4835]: I1002 12:27:15.190991 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s798d_758c6988-399c-4303-a629-876f1234d88e/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:15 crc kubenswrapper[4835]: I1002 12:27:15.493212 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ebbc364a-1b3f-40ab-b214-e6b301ae4c1e/nova-metadata-log/0.log" Oct 02 12:27:15 crc kubenswrapper[4835]: I1002 12:27:15.935835 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4d7698ba-7a87-4210-9d11-8bb99f997178/nova-scheduler-scheduler/0.log" Oct 02 12:27:16 crc kubenswrapper[4835]: I1002 12:27:16.172368 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a1c80fb7-b373-407a-9024-6399def35365/mysql-bootstrap/0.log" Oct 02 12:27:16 crc kubenswrapper[4835]: I1002 12:27:16.344827 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a1c80fb7-b373-407a-9024-6399def35365/mysql-bootstrap/0.log" Oct 02 12:27:16 crc kubenswrapper[4835]: I1002 12:27:16.367322 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a1c80fb7-b373-407a-9024-6399def35365/galera/0.log" Oct 02 12:27:16 crc kubenswrapper[4835]: I1002 12:27:16.626415 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9ed37190-5b25-4c46-a9af-9d2b07322f98/mysql-bootstrap/0.log" Oct 02 12:27:16 crc kubenswrapper[4835]: I1002 12:27:16.818637 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_9ed37190-5b25-4c46-a9af-9d2b07322f98/mysql-bootstrap/0.log" Oct 02 12:27:16 crc kubenswrapper[4835]: I1002 12:27:16.846148 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9ed37190-5b25-4c46-a9af-9d2b07322f98/galera/0.log" Oct 02 12:27:17 crc kubenswrapper[4835]: I1002 12:27:17.077425 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_748cf871-b4a4-418b-9c72-c2c21e1f85ad/openstackclient/0.log" Oct 02 12:27:17 crc kubenswrapper[4835]: I1002 12:27:17.319327 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2tk75_83798c14-4aa1-4530-82eb-fbe0cd6ceaf9/ovn-controller/0.log" Oct 02 12:27:17 crc kubenswrapper[4835]: I1002 12:27:17.589935 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ebbc364a-1b3f-40ab-b214-e6b301ae4c1e/nova-metadata-metadata/0.log" Oct 02 12:27:17 crc kubenswrapper[4835]: I1002 12:27:17.591430 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fhwnp_b5831d90-f0d5-47a7-9ff0-532a176b71d2/openstack-network-exporter/0.log" Oct 02 12:27:17 crc kubenswrapper[4835]: I1002 12:27:17.788207 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4bgdg_d68db426-57b9-479b-92d4-e4661cbd2711/ovsdb-server-init/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.063883 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4bgdg_d68db426-57b9-479b-92d4-e4661cbd2711/ovsdb-server/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.082365 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4bgdg_d68db426-57b9-479b-92d4-e4661cbd2711/ovsdb-server-init/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.102918 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4bgdg_d68db426-57b9-479b-92d4-e4661cbd2711/ovs-vswitchd/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.346801 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9cvqw_af586fd6-b857-4897-8b19-4d57315fba61/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.535830 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_390fd0ef-bba6-458e-b1e6-121f3f846077/ovn-northd/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.569760 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_390fd0ef-bba6-458e-b1e6-121f3f846077/openstack-network-exporter/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.768895 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e/openstack-network-exporter/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.810599 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c1be9dcf-ac2d-467b-9c8a-2ea18fb8bf0e/ovsdbserver-nb/0.log" Oct 02 12:27:18 crc kubenswrapper[4835]: I1002 12:27:18.944326 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48eefe46-521f-4c42-8796-ba131eae6a9e/openstack-network-exporter/0.log" Oct 02 12:27:19 crc kubenswrapper[4835]: I1002 12:27:19.059737 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_48eefe46-521f-4c42-8796-ba131eae6a9e/ovsdbserver-sb/0.log" Oct 02 12:27:19 crc kubenswrapper[4835]: I1002 12:27:19.252295 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cfb46b6c6-x7hzh_f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def/placement-api/0.log" Oct 02 12:27:19 crc kubenswrapper[4835]: I1002 12:27:19.441659 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_74ecc09a-8044-49d6-8c9b-2cbcc56d9612/setup-container/0.log" Oct 02 12:27:19 crc kubenswrapper[4835]: I1002 12:27:19.442860 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cfb46b6c6-x7hzh_f9c7fcb7-c27c-4f38-840f-a3ac9e5e5def/placement-log/0.log" Oct 02 12:27:19 crc kubenswrapper[4835]: I1002 12:27:19.635665 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_74ecc09a-8044-49d6-8c9b-2cbcc56d9612/setup-container/0.log" Oct 02 12:27:19 crc kubenswrapper[4835]: I1002 12:27:19.774491 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_74ecc09a-8044-49d6-8c9b-2cbcc56d9612/rabbitmq/0.log" Oct 02 12:27:19 crc kubenswrapper[4835]: I1002 12:27:19.836460 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_743bfb49-5459-4911-8eee-4bb313368c21/setup-container/0.log" Oct 02 12:27:20 crc kubenswrapper[4835]: I1002 12:27:20.057384 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_743bfb49-5459-4911-8eee-4bb313368c21/rabbitmq/0.log" Oct 02 12:27:20 crc kubenswrapper[4835]: I1002 12:27:20.059461 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_743bfb49-5459-4911-8eee-4bb313368c21/setup-container/0.log" Oct 02 12:27:20 crc kubenswrapper[4835]: I1002 12:27:20.285582 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-q2bbk_f731fa22-ff01-4d11-9cd5-166d6b2d54fb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:20 crc kubenswrapper[4835]: I1002 12:27:20.365045 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-crn7p_979ea118-1fa5-4dab-838b-8fbd15307fbc/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:20 crc kubenswrapper[4835]: I1002 12:27:20.552990 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7nvnf_a12bde0e-0dbc-4c10-a40f-89a3f77690d3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:20 crc kubenswrapper[4835]: I1002 12:27:20.715608 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9p8hq_e681c894-43b7-4a03-a987-7f7fb40f0754/ssh-known-hosts-edpm-deployment/0.log" Oct 02 12:27:20 crc kubenswrapper[4835]: I1002 12:27:20.938269 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5922a15b-856f-45aa-aed9-d8787e4f470f/tempest-tests-tempest-tests-runner/0.log" Oct 02 12:27:21 crc kubenswrapper[4835]: I1002 12:27:21.024347 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f8933811-45f7-4f7e-b7bb-6fe07421852c/test-operator-logs-container/0.log" Oct 02 12:27:21 crc kubenswrapper[4835]: I1002 12:27:21.195201 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pq2zh_d3a1bc51-7cd2-45b7-bb63-0f7af1f913a8/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 12:27:28 crc kubenswrapper[4835]: I1002 12:27:28.254337 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:27:28 crc kubenswrapper[4835]: E1002 12:27:28.256170 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:27:36 crc kubenswrapper[4835]: I1002 12:27:36.655085 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fda9ea37-267e-46e9-b3ad-721123c57703/memcached/0.log" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.748548 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-76xgt"] Oct 02 12:27:37 crc kubenswrapper[4835]: E1002 12:27:37.749370 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="registry-server" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.749387 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="registry-server" Oct 02 12:27:37 crc kubenswrapper[4835]: E1002 12:27:37.749427 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="extract-content" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.749437 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="extract-content" Oct 02 12:27:37 crc kubenswrapper[4835]: E1002 12:27:37.749456 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="extract-utilities" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.749464 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="extract-utilities" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.749683 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe8e92a-7c50-436a-8353-6a4d19026895" containerName="registry-server" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.751475 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.762146 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76xgt"] Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.877498 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e50b1562-cfd0-4038-bb85-53e0c33838fc-catalog-content\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.877596 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e50b1562-cfd0-4038-bb85-53e0c33838fc-utilities\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.877623 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjnnz\" (UniqueName: \"kubernetes.io/projected/e50b1562-cfd0-4038-bb85-53e0c33838fc-kube-api-access-sjnnz\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.979245 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e50b1562-cfd0-4038-bb85-53e0c33838fc-catalog-content\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.979337 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e50b1562-cfd0-4038-bb85-53e0c33838fc-utilities\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.979362 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjnnz\" (UniqueName: \"kubernetes.io/projected/e50b1562-cfd0-4038-bb85-53e0c33838fc-kube-api-access-sjnnz\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.980184 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e50b1562-cfd0-4038-bb85-53e0c33838fc-catalog-content\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:37 crc kubenswrapper[4835]: I1002 12:27:37.980424 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e50b1562-cfd0-4038-bb85-53e0c33838fc-utilities\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:38 crc kubenswrapper[4835]: I1002 12:27:38.001493 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sjnnz\" (UniqueName: \"kubernetes.io/projected/e50b1562-cfd0-4038-bb85-53e0c33838fc-kube-api-access-sjnnz\") pod \"certified-operators-76xgt\" (UID: \"e50b1562-cfd0-4038-bb85-53e0c33838fc\") " pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:38 crc kubenswrapper[4835]: I1002 12:27:38.075829 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:38 crc kubenswrapper[4835]: I1002 12:27:38.685377 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76xgt"] Oct 02 12:27:39 crc kubenswrapper[4835]: I1002 12:27:39.385501 4835 generic.go:334] "Generic (PLEG): container finished" podID="e50b1562-cfd0-4038-bb85-53e0c33838fc" containerID="b87fa676a55d6b272ea414d16e0bd7b9d863ea501b62bd478613e9168cfc5bdd" exitCode=0 Oct 02 12:27:39 crc kubenswrapper[4835]: I1002 12:27:39.385560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76xgt" event={"ID":"e50b1562-cfd0-4038-bb85-53e0c33838fc","Type":"ContainerDied","Data":"b87fa676a55d6b272ea414d16e0bd7b9d863ea501b62bd478613e9168cfc5bdd"} Oct 02 12:27:39 crc kubenswrapper[4835]: I1002 12:27:39.385737 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76xgt" event={"ID":"e50b1562-cfd0-4038-bb85-53e0c33838fc","Type":"ContainerStarted","Data":"12b7df13171ee30e318010edbb47e1d64c1892be74ef888e663a2a20ec75cb4d"} Oct 02 12:27:43 crc kubenswrapper[4835]: I1002 12:27:43.252198 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:27:43 crc kubenswrapper[4835]: E1002 12:27:43.253081 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:27:45 crc kubenswrapper[4835]: I1002 12:27:45.446853 4835 generic.go:334] "Generic (PLEG): container finished" podID="e50b1562-cfd0-4038-bb85-53e0c33838fc" containerID="4459b02baa71e123d9f5464b28db81283953387f710f6ab440122d27474aedb8" exitCode=0 Oct 02 12:27:45 crc kubenswrapper[4835]: I1002 12:27:45.447325 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76xgt" event={"ID":"e50b1562-cfd0-4038-bb85-53e0c33838fc","Type":"ContainerDied","Data":"4459b02baa71e123d9f5464b28db81283953387f710f6ab440122d27474aedb8"} Oct 02 12:27:46 crc kubenswrapper[4835]: I1002 12:27:46.459989 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76xgt" event={"ID":"e50b1562-cfd0-4038-bb85-53e0c33838fc","Type":"ContainerStarted","Data":"e02c6864f13d6fad5d44e11456eef582e2b0b5569dab12277c145a8cdf96dda4"} Oct 02 12:27:46 crc kubenswrapper[4835]: I1002 12:27:46.485149 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-76xgt" podStartSLOduration=2.812860014 podStartE2EDuration="9.485127119s" podCreationTimestamp="2025-10-02 12:27:37 +0000 UTC" firstStartedPulling="2025-10-02 12:27:39.387417387 +0000 UTC m=+5535.947324968" 
lastFinishedPulling="2025-10-02 12:27:46.059684482 +0000 UTC m=+5542.619592073" observedRunningTime="2025-10-02 12:27:46.478434277 +0000 UTC m=+5543.038341868" watchObservedRunningTime="2025-10-02 12:27:46.485127119 +0000 UTC m=+5543.045034710" Oct 02 12:27:48 crc kubenswrapper[4835]: I1002 12:27:48.076208 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:48 crc kubenswrapper[4835]: I1002 12:27:48.076836 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:49 crc kubenswrapper[4835]: I1002 12:27:49.123036 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-76xgt" podUID="e50b1562-cfd0-4038-bb85-53e0c33838fc" containerName="registry-server" probeResult="failure" output=< Oct 02 12:27:49 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 02 12:27:49 crc kubenswrapper[4835]: > Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.135937 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.222510 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-76xgt" Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.253378 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:27:58 crc kubenswrapper[4835]: E1002 12:27:58.253696 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.314813 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76xgt"] Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.397477 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6dqb8"] Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.397757 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6dqb8" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="registry-server" containerID="cri-o://775a493269a22a1ffdea195fdb8546262ad75863982f43ebe4042eadac647b5a" gracePeriod=2 Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.590995 4835 generic.go:334] "Generic (PLEG): container finished" podID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerID="775a493269a22a1ffdea195fdb8546262ad75863982f43ebe4042eadac647b5a" exitCode=0 Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.591094 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dqb8" event={"ID":"e36c5663-20a7-467b-a112-9f6a409bde0a","Type":"ContainerDied","Data":"775a493269a22a1ffdea195fdb8546262ad75863982f43ebe4042eadac647b5a"} Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.869846 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.990990 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-utilities\") pod \"e36c5663-20a7-467b-a112-9f6a409bde0a\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.991054 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz47w\" (UniqueName: \"kubernetes.io/projected/e36c5663-20a7-467b-a112-9f6a409bde0a-kube-api-access-bz47w\") pod \"e36c5663-20a7-467b-a112-9f6a409bde0a\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.991241 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-catalog-content\") pod \"e36c5663-20a7-467b-a112-9f6a409bde0a\" (UID: \"e36c5663-20a7-467b-a112-9f6a409bde0a\") " Oct 02 12:27:58 crc kubenswrapper[4835]: I1002 12:27:58.992590 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-utilities" (OuterVolumeSpecName: "utilities") pod "e36c5663-20a7-467b-a112-9f6a409bde0a" (UID: "e36c5663-20a7-467b-a112-9f6a409bde0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.012447 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36c5663-20a7-467b-a112-9f6a409bde0a-kube-api-access-bz47w" (OuterVolumeSpecName: "kube-api-access-bz47w") pod "e36c5663-20a7-467b-a112-9f6a409bde0a" (UID: "e36c5663-20a7-467b-a112-9f6a409bde0a"). InnerVolumeSpecName "kube-api-access-bz47w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.079478 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e36c5663-20a7-467b-a112-9f6a409bde0a" (UID: "e36c5663-20a7-467b-a112-9f6a409bde0a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.094026 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.094061 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c5663-20a7-467b-a112-9f6a409bde0a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.094072 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz47w\" (UniqueName: \"kubernetes.io/projected/e36c5663-20a7-467b-a112-9f6a409bde0a-kube-api-access-bz47w\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.604687 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dqb8" event={"ID":"e36c5663-20a7-467b-a112-9f6a409bde0a","Type":"ContainerDied","Data":"25a29a990b5debb9ad9d057fdcafa915cf55a75abfbdf3f283480916e705f07e"} Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.604703 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6dqb8" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.604773 4835 scope.go:117] "RemoveContainer" containerID="775a493269a22a1ffdea195fdb8546262ad75863982f43ebe4042eadac647b5a" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.636371 4835 scope.go:117] "RemoveContainer" containerID="b9b0d2c1a0da509b044da5a26c60a870e4f3ca31a05c539024fe5e01cd084bf4" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.662948 4835 scope.go:117] "RemoveContainer" containerID="8d9761140605e7a8ad46e61d380f11e029e80df5f83ba81570d757d9114fc351" Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.666890 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6dqb8"] Oct 02 12:27:59 crc kubenswrapper[4835]: I1002 12:27:59.681652 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6dqb8"] Oct 02 12:28:00 crc kubenswrapper[4835]: I1002 12:28:00.266440 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" path="/var/lib/kubelet/pods/e36c5663-20a7-467b-a112-9f6a409bde0a/volumes" Oct 02 12:28:11 crc kubenswrapper[4835]: I1002 12:28:11.251816 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:28:11 crc kubenswrapper[4835]: E1002 12:28:11.252935 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:28:19 crc kubenswrapper[4835]: I1002 12:28:19.806206 4835 generic.go:334] "Generic (PLEG): container finished" podID="890c484e-50e0-4f72-a950-934b03f085b7" containerID="d878f12789396ba88162a95016421042b8f371741f260b99ac2c492439f50eff" exitCode=0 Oct 02 12:28:19 crc kubenswrapper[4835]: I1002 12:28:19.806272 4835 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" event={"ID":"890c484e-50e0-4f72-a950-934b03f085b7","Type":"ContainerDied","Data":"d878f12789396ba88162a95016421042b8f371741f260b99ac2c492439f50eff"} Oct 02 12:28:20 crc kubenswrapper[4835]: I1002 12:28:20.924572 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:28:20 crc kubenswrapper[4835]: I1002 12:28:20.956602 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-jnqw8"] Oct 02 12:28:20 crc kubenswrapper[4835]: I1002 12:28:20.964572 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-jnqw8"] Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.014873 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c67g7\" (UniqueName: \"kubernetes.io/projected/890c484e-50e0-4f72-a950-934b03f085b7-kube-api-access-c67g7\") pod \"890c484e-50e0-4f72-a950-934b03f085b7\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.015789 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/890c484e-50e0-4f72-a950-934b03f085b7-host\") pod \"890c484e-50e0-4f72-a950-934b03f085b7\" (UID: \"890c484e-50e0-4f72-a950-934b03f085b7\") " Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.015963 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/890c484e-50e0-4f72-a950-934b03f085b7-host" (OuterVolumeSpecName: "host") pod "890c484e-50e0-4f72-a950-934b03f085b7" (UID: "890c484e-50e0-4f72-a950-934b03f085b7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.016776 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/890c484e-50e0-4f72-a950-934b03f085b7-host\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.021850 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890c484e-50e0-4f72-a950-934b03f085b7-kube-api-access-c67g7" (OuterVolumeSpecName: "kube-api-access-c67g7") pod "890c484e-50e0-4f72-a950-934b03f085b7" (UID: "890c484e-50e0-4f72-a950-934b03f085b7"). InnerVolumeSpecName "kube-api-access-c67g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.119325 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c67g7\" (UniqueName: \"kubernetes.io/projected/890c484e-50e0-4f72-a950-934b03f085b7-kube-api-access-c67g7\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.828624 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865b4e9cf8692ad96a1beeb724adb9d14bd9fc99a96e5b3a680ac36471011be7" Oct 02 12:28:21 crc kubenswrapper[4835]: I1002 12:28:21.828710 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jnqw8" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.134019 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-jqz7r"] Oct 02 12:28:22 crc kubenswrapper[4835]: E1002 12:28:22.134425 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890c484e-50e0-4f72-a950-934b03f085b7" containerName="container-00" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.134438 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="890c484e-50e0-4f72-a950-934b03f085b7" containerName="container-00" Oct 02 12:28:22 crc kubenswrapper[4835]: E1002 12:28:22.134454 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="extract-utilities" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.134461 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="extract-utilities" Oct 02 12:28:22 crc kubenswrapper[4835]: E1002 12:28:22.134483 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="extract-content" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.134490 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="extract-content" Oct 02 12:28:22 crc kubenswrapper[4835]: E1002 12:28:22.134519 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="registry-server" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.134526 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="registry-server" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.134737 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36c5663-20a7-467b-a112-9f6a409bde0a" containerName="registry-server" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.134752 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="890c484e-50e0-4f72-a950-934b03f085b7" containerName="container-00" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.135445 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.242195 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dks\" (UniqueName: \"kubernetes.io/projected/93d7a367-c1fa-4903-b008-f96c36dbecd0-kube-api-access-h6dks\") pod \"crc-debug-jqz7r\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.242525 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93d7a367-c1fa-4903-b008-f96c36dbecd0-host\") pod \"crc-debug-jqz7r\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.253507 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:28:22 crc kubenswrapper[4835]: E1002 12:28:22.255430 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.265667 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890c484e-50e0-4f72-a950-934b03f085b7" path="/var/lib/kubelet/pods/890c484e-50e0-4f72-a950-934b03f085b7/volumes" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.347378 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dks\" (UniqueName: \"kubernetes.io/projected/93d7a367-c1fa-4903-b008-f96c36dbecd0-kube-api-access-h6dks\") pod \"crc-debug-jqz7r\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.347677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93d7a367-c1fa-4903-b008-f96c36dbecd0-host\") pod \"crc-debug-jqz7r\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.347888 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93d7a367-c1fa-4903-b008-f96c36dbecd0-host\") pod \"crc-debug-jqz7r\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.367290 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dks\" (UniqueName: \"kubernetes.io/projected/93d7a367-c1fa-4903-b008-f96c36dbecd0-kube-api-access-h6dks\") pod \"crc-debug-jqz7r\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.458119 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.837537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" event={"ID":"93d7a367-c1fa-4903-b008-f96c36dbecd0","Type":"ContainerStarted","Data":"383bd83b282af54552f016566cb0811506778a20e7d0e8574e9064e612fa4653"} Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.837855 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" event={"ID":"93d7a367-c1fa-4903-b008-f96c36dbecd0","Type":"ContainerStarted","Data":"b5aeee2dec3904c29b8df1bcd275b433d429f772fc945c5da851df541cc0d408"} Oct 02 12:28:22 crc kubenswrapper[4835]: I1002 12:28:22.851200 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" podStartSLOduration=0.851177027 podStartE2EDuration="851.177027ms" podCreationTimestamp="2025-10-02 12:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:22.849445067 +0000 UTC m=+5579.409352668" watchObservedRunningTime="2025-10-02 12:28:22.851177027 +0000 UTC m=+5579.411084608" Oct 02 12:28:23 crc kubenswrapper[4835]: I1002 12:28:23.847329 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" event={"ID":"93d7a367-c1fa-4903-b008-f96c36dbecd0","Type":"ContainerDied","Data":"383bd83b282af54552f016566cb0811506778a20e7d0e8574e9064e612fa4653"} Oct 02 12:28:23 crc kubenswrapper[4835]: I1002 12:28:23.847816 4835 generic.go:334] "Generic (PLEG): container finished" podID="93d7a367-c1fa-4903-b008-f96c36dbecd0" containerID="383bd83b282af54552f016566cb0811506778a20e7d0e8574e9064e612fa4653" exitCode=0 Oct 02 12:28:24 crc kubenswrapper[4835]: I1002 12:28:24.963707 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.107892 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93d7a367-c1fa-4903-b008-f96c36dbecd0-host\") pod \"93d7a367-c1fa-4903-b008-f96c36dbecd0\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.108022 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93d7a367-c1fa-4903-b008-f96c36dbecd0-host" (OuterVolumeSpecName: "host") pod "93d7a367-c1fa-4903-b008-f96c36dbecd0" (UID: "93d7a367-c1fa-4903-b008-f96c36dbecd0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.108260 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6dks\" (UniqueName: \"kubernetes.io/projected/93d7a367-c1fa-4903-b008-f96c36dbecd0-kube-api-access-h6dks\") pod \"93d7a367-c1fa-4903-b008-f96c36dbecd0\" (UID: \"93d7a367-c1fa-4903-b008-f96c36dbecd0\") " Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.108781 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93d7a367-c1fa-4903-b008-f96c36dbecd0-host\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.113073 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d7a367-c1fa-4903-b008-f96c36dbecd0-kube-api-access-h6dks" (OuterVolumeSpecName: "kube-api-access-h6dks") pod "93d7a367-c1fa-4903-b008-f96c36dbecd0" (UID: "93d7a367-c1fa-4903-b008-f96c36dbecd0"). InnerVolumeSpecName "kube-api-access-h6dks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.210698 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6dks\" (UniqueName: \"kubernetes.io/projected/93d7a367-c1fa-4903-b008-f96c36dbecd0-kube-api-access-h6dks\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.868057 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" event={"ID":"93d7a367-c1fa-4903-b008-f96c36dbecd0","Type":"ContainerDied","Data":"b5aeee2dec3904c29b8df1bcd275b433d429f772fc945c5da851df541cc0d408"} Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.868377 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5aeee2dec3904c29b8df1bcd275b433d429f772fc945c5da851df541cc0d408" Oct 02 12:28:25 crc kubenswrapper[4835]: I1002 12:28:25.868444 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-jqz7r" Oct 02 12:28:33 crc kubenswrapper[4835]: I1002 12:28:33.140200 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-jqz7r"] Oct 02 12:28:33 crc kubenswrapper[4835]: I1002 12:28:33.147970 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-jqz7r"] Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.264894 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d7a367-c1fa-4903-b008-f96c36dbecd0" path="/var/lib/kubelet/pods/93d7a367-c1fa-4903-b008-f96c36dbecd0/volumes" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.319883 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-m5z6m"] Oct 02 12:28:34 crc kubenswrapper[4835]: E1002 12:28:34.320383 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d7a367-c1fa-4903-b008-f96c36dbecd0" containerName="container-00" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.320414 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d7a367-c1fa-4903-b008-f96c36dbecd0" containerName="container-00" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.320703 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d7a367-c1fa-4903-b008-f96c36dbecd0" containerName="container-00" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.321472 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.489450 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-host\") pod \"crc-debug-m5z6m\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.490021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r25h\" (UniqueName: \"kubernetes.io/projected/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-kube-api-access-4r25h\") pod \"crc-debug-m5z6m\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.591889 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r25h\" (UniqueName: \"kubernetes.io/projected/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-kube-api-access-4r25h\") pod \"crc-debug-m5z6m\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.592401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-host\") pod \"crc-debug-m5z6m\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.592548 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-host\") pod \"crc-debug-m5z6m\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: 
I1002 12:28:34.626774 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r25h\" (UniqueName: \"kubernetes.io/projected/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-kube-api-access-4r25h\") pod \"crc-debug-m5z6m\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.642157 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.942346 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" event={"ID":"4c4f2ad2-efee-4ffb-8a74-1af0a20234da","Type":"ContainerStarted","Data":"b5a20e80978dfb2b99034b10df50f60ebe6c69bcb1fe6d55045aa6336931fb8b"} Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.942948 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" event={"ID":"4c4f2ad2-efee-4ffb-8a74-1af0a20234da","Type":"ContainerStarted","Data":"e5e1719f0fe1cdc95b56458195060fadc4ee86df753991660cad0ee1080195a0"} Oct 02 12:28:34 crc kubenswrapper[4835]: I1002 12:28:34.956103 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" podStartSLOduration=0.956086589 podStartE2EDuration="956.086589ms" podCreationTimestamp="2025-10-02 12:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:34.954646677 +0000 UTC m=+5591.514554258" watchObservedRunningTime="2025-10-02 12:28:34.956086589 +0000 UTC m=+5591.515994170" Oct 02 12:28:35 crc kubenswrapper[4835]: I1002 12:28:35.252388 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:28:35 crc kubenswrapper[4835]: E1002 12:28:35.253490 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5ckb9_openshift-machine-config-operator(ce0ad186-63b7-432a-a0ca-4d4cbde057a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" Oct 02 12:28:35 crc kubenswrapper[4835]: I1002 12:28:35.954994 4835 generic.go:334] "Generic (PLEG): container finished" podID="4c4f2ad2-efee-4ffb-8a74-1af0a20234da" containerID="b5a20e80978dfb2b99034b10df50f60ebe6c69bcb1fe6d55045aa6336931fb8b" exitCode=0 Oct 02 12:28:35 crc kubenswrapper[4835]: I1002 12:28:35.955063 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" event={"ID":"4c4f2ad2-efee-4ffb-8a74-1af0a20234da","Type":"ContainerDied","Data":"b5a20e80978dfb2b99034b10df50f60ebe6c69bcb1fe6d55045aa6336931fb8b"} Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.104803 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.141181 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-m5z6m"] Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.150802 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfs2/crc-debug-m5z6m"] Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.244628 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r25h\" (UniqueName: \"kubernetes.io/projected/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-kube-api-access-4r25h\") pod \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.244701 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-host\") pod \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\" (UID: \"4c4f2ad2-efee-4ffb-8a74-1af0a20234da\") " Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.244845 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-host" (OuterVolumeSpecName: "host") pod "4c4f2ad2-efee-4ffb-8a74-1af0a20234da" (UID: "4c4f2ad2-efee-4ffb-8a74-1af0a20234da"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.245161 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-host\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.261625 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-kube-api-access-4r25h" (OuterVolumeSpecName: "kube-api-access-4r25h") pod "4c4f2ad2-efee-4ffb-8a74-1af0a20234da" (UID: "4c4f2ad2-efee-4ffb-8a74-1af0a20234da"). InnerVolumeSpecName "kube-api-access-4r25h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.346901 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r25h\" (UniqueName: \"kubernetes.io/projected/4c4f2ad2-efee-4ffb-8a74-1af0a20234da-kube-api-access-4r25h\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.977907 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e1719f0fe1cdc95b56458195060fadc4ee86df753991660cad0ee1080195a0" Oct 02 12:28:37 crc kubenswrapper[4835]: I1002 12:28:37.977973 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/crc-debug-m5z6m" Oct 02 12:28:38 crc kubenswrapper[4835]: I1002 12:28:38.263891 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4f2ad2-efee-4ffb-8a74-1af0a20234da" path="/var/lib/kubelet/pods/4c4f2ad2-efee-4ffb-8a74-1af0a20234da/volumes" Oct 02 12:28:38 crc kubenswrapper[4835]: I1002 12:28:38.616454 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp_f93ea9d4-aae6-4d12-aef2-7ecf558b4fef/util/0.log" Oct 02 12:28:38 crc kubenswrapper[4835]: I1002 12:28:38.774975 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp_f93ea9d4-aae6-4d12-aef2-7ecf558b4fef/util/0.log" Oct 02 12:28:38 crc kubenswrapper[4835]: I1002 12:28:38.802964 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp_f93ea9d4-aae6-4d12-aef2-7ecf558b4fef/pull/0.log" Oct 02 12:28:38 crc kubenswrapper[4835]: I1002 12:28:38.831055 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp_f93ea9d4-aae6-4d12-aef2-7ecf558b4fef/pull/0.log" Oct 02 12:28:38 crc kubenswrapper[4835]: I1002 12:28:38.981014 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp_f93ea9d4-aae6-4d12-aef2-7ecf558b4fef/util/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.032083 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp_f93ea9d4-aae6-4d12-aef2-7ecf558b4fef/pull/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.040054 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6eda356edce4fb822714a05b699bf39611b7fd3f92ae6b58080824dfc194vrp_f93ea9d4-aae6-4d12-aef2-7ecf558b4fef/extract/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.153765 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n7dnn_0805fa88-ea1a-4dec-b686-1024df504971/kube-rbac-proxy/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.267194 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6f6c6946b9-gqb82_e0c4310c-242a-4a50-b5b3-6b1705d8ce4d/kube-rbac-proxy/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.267364 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n7dnn_0805fa88-ea1a-4dec-b686-1024df504971/manager/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.469492 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6f6c6946b9-gqb82_e0c4310c-242a-4a50-b5b3-6b1705d8ce4d/manager/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.491585 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-wfqz7_bb6fcd5f-03e8-4d83-bfeb-6e91b2852548/kube-rbac-proxy/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.494447 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-wfqz7_bb6fcd5f-03e8-4d83-bfeb-6e91b2852548/manager/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.639387 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-bwkr6_3c8953f0-3559-496e-a893-76a065eea629/kube-rbac-proxy/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.774597 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-bwkr6_3c8953f0-3559-496e-a893-76a065eea629/manager/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.866622 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-lgp76_f600efc9-bc85-4462-901d-10cb6ec3113c/kube-rbac-proxy/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.899210 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-lgp76_f600efc9-bc85-4462-901d-10cb6ec3113c/manager/0.log" Oct 02 12:28:39 crc kubenswrapper[4835]: I1002 12:28:39.978694 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-6ggrv_d574af81-6939-4076-8194-049c15ffb305/kube-rbac-proxy/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.087168 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-6ggrv_d574af81-6939-4076-8194-049c15ffb305/manager/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.140145 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-2bldp_971d8eb9-c70a-45b2-a7c3-20e6b62bbd48/kube-rbac-proxy/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.353137 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-fqn5z_3267ccbe-611d-45a1-86fd-b901c6b52373/kube-rbac-proxy/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.379638 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-2bldp_971d8eb9-c70a-45b2-a7c3-20e6b62bbd48/manager/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.385073 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-fqn5z_3267ccbe-611d-45a1-86fd-b901c6b52373/manager/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.553527 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-z4qtt_53591aeb-91e2-4d05-b596-6b9d5b7dcd3f/kube-rbac-proxy/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.619231 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-z4qtt_53591aeb-91e2-4d05-b596-6b9d5b7dcd3f/manager/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.700651 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-s85jn_574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e/kube-rbac-proxy/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.772433 4835 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-s85jn_574e5f9c-8e3f-4c9e-b562-ff40fa42ac3e/manager/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.824437 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-sqbrr_b8bda3e4-db9e-4d2c-a352-71f1cde3536b/kube-rbac-proxy/0.log" Oct 02 12:28:40 crc kubenswrapper[4835]: I1002 12:28:40.938150 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-sqbrr_b8bda3e4-db9e-4d2c-a352-71f1cde3536b/manager/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.024256 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-srxgs_d6b4f40c-4be2-445b-ab93-583917fb3d1a/kube-rbac-proxy/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.070530 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-srxgs_d6b4f40c-4be2-445b-ab93-583917fb3d1a/manager/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.191247 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-xpmxm_365587d7-12ca-4826-90c7-b56fac3ac05b/kube-rbac-proxy/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.317362 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-xpmxm_365587d7-12ca-4826-90c7-b56fac3ac05b/manager/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.383998 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-zc9tx_8b8872b0-5d5a-4934-b298-33b61782bd55/manager/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.392813 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-zc9tx_8b8872b0-5d5a-4934-b298-33b61782bd55/kube-rbac-proxy/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.527456 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-btrqt_36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f/kube-rbac-proxy/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.556492 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-btrqt_36b9fcb3-e1e2-4a47-a1a9-dc7702cab83f/manager/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.704777 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b66d9b9d9-k6n7b_b04cc853-44aa-4377-9c5d-339b1bbd0a78/kube-rbac-proxy/0.log" Oct 02 12:28:41 crc kubenswrapper[4835]: I1002 12:28:41.823412 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-db7694d5f-lbfr7_937e16a3-fce3-4b5f-8969-69995e59a465/kube-rbac-proxy/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.046289 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-vf2mn_2d8f70e8-7a4e-4ff1-90a1-d53d0fdc3bd9/registry-server/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.080813 
4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-db7694d5f-lbfr7_937e16a3-fce3-4b5f-8969-69995e59a465/operator/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.287771 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-2l6rr_593a4aeb-9c94-487c-bc8c-f234545762d6/kube-rbac-proxy/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.348138 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-2l6rr_593a4aeb-9c94-487c-bc8c-f234545762d6/manager/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.552680 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-qrdqd_33869fcf-8635-4c66-8364-4fb107c8930e/kube-rbac-proxy/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.574154 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-qrdqd_33869fcf-8635-4c66-8364-4fb107c8930e/manager/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.661429 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-w7xcv_1293e5be-4d9a-40ca-81b8-576f674acd7c/operator/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.797878 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-42v6d_d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf/kube-rbac-proxy/0.log" Oct 02 12:28:42 crc kubenswrapper[4835]: I1002 12:28:42.847139 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-42v6d_d1bc5dfc-8f9d-40fd-bfcb-3c0fb0e415bf/manager/0.log" Oct 02 12:28:43 crc kubenswrapper[4835]: I1002 12:28:43.012185 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-x5w98_ce0e3802-3a95-41a0-91cf-6584596b44ec/kube-rbac-proxy/0.log" Oct 02 12:28:43 crc kubenswrapper[4835]: I1002 12:28:43.066135 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b66d9b9d9-k6n7b_b04cc853-44aa-4377-9c5d-339b1bbd0a78/manager/0.log" Oct 02 12:28:43 crc kubenswrapper[4835]: I1002 12:28:43.100442 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-x5w98_ce0e3802-3a95-41a0-91cf-6584596b44ec/manager/0.log" Oct 02 12:28:43 crc kubenswrapper[4835]: I1002 12:28:43.140561 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-kzmtv_efd33f31-5093-4354-aa38-e40279007a57/kube-rbac-proxy/0.log" Oct 02 12:28:43 crc kubenswrapper[4835]: I1002 12:28:43.198341 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-kzmtv_efd33f31-5093-4354-aa38-e40279007a57/manager/0.log" Oct 02 12:28:43 crc kubenswrapper[4835]: I1002 12:28:43.300526 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-4nlhw_598297ec-cf43-432d-b7b9-67cc5c52ee46/kube-rbac-proxy/0.log" Oct 02 12:28:43 crc kubenswrapper[4835]: I1002 
12:28:43.337257 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-4nlhw_598297ec-cf43-432d-b7b9-67cc5c52ee46/manager/0.log" Oct 02 12:28:46 crc kubenswrapper[4835]: I1002 12:28:46.994962 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rxbn8"] Oct 02 12:28:46 crc kubenswrapper[4835]: E1002 12:28:46.996158 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4f2ad2-efee-4ffb-8a74-1af0a20234da" containerName="container-00" Oct 02 12:28:46 crc kubenswrapper[4835]: I1002 12:28:46.996179 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4f2ad2-efee-4ffb-8a74-1af0a20234da" containerName="container-00" Oct 02 12:28:46 crc kubenswrapper[4835]: I1002 12:28:46.996490 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4f2ad2-efee-4ffb-8a74-1af0a20234da" containerName="container-00" Oct 02 12:28:46 crc kubenswrapper[4835]: I1002 12:28:46.998572 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.007299 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxbn8"] Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.135638 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-catalog-content\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.135759 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-utilities\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.135806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbd2\" (UniqueName: \"kubernetes.io/projected/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-kube-api-access-6vbd2\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.238046 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-catalog-content\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.238171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-utilities\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.238218 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbd2\" (UniqueName: 
\"kubernetes.io/projected/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-kube-api-access-6vbd2\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.238965 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-utilities\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.239022 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-catalog-content\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.262663 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbd2\" (UniqueName: \"kubernetes.io/projected/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-kube-api-access-6vbd2\") pod \"redhat-marketplace-rxbn8\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.352690 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:47 crc kubenswrapper[4835]: I1002 12:28:47.855586 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxbn8"] Oct 02 12:28:48 crc kubenswrapper[4835]: I1002 12:28:48.092571 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxbn8" event={"ID":"fdf5ad25-dce9-4ead-b67f-785b499e5b9d","Type":"ContainerStarted","Data":"e0b63b6fc8a0b93c91b8a44857d6e9e5d2b84c6a82851ed331a7284690a7f2a0"} Oct 02 12:28:49 crc kubenswrapper[4835]: I1002 12:28:49.101874 4835 generic.go:334] "Generic (PLEG): container finished" podID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerID="c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9" exitCode=0 Oct 02 12:28:49 crc kubenswrapper[4835]: I1002 12:28:49.101976 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxbn8" event={"ID":"fdf5ad25-dce9-4ead-b67f-785b499e5b9d","Type":"ContainerDied","Data":"c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9"} Oct 02 12:28:49 crc kubenswrapper[4835]: I1002 12:28:49.103942 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:28:50 crc kubenswrapper[4835]: I1002 12:28:50.112317 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxbn8" event={"ID":"fdf5ad25-dce9-4ead-b67f-785b499e5b9d","Type":"ContainerStarted","Data":"124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98"} Oct 02 12:28:50 crc kubenswrapper[4835]: I1002 12:28:50.252159 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:28:51 crc kubenswrapper[4835]: I1002 12:28:51.127360 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" 
event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"fd5656421c8d6ecb2c1780bcbf4ce2378ab8835bd3a3320e051dd565eb10b4b8"} Oct 02 12:28:51 crc kubenswrapper[4835]: I1002 12:28:51.131262 4835 generic.go:334] "Generic (PLEG): container finished" podID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerID="124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98" exitCode=0 Oct 02 12:28:51 crc kubenswrapper[4835]: I1002 12:28:51.131306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxbn8" event={"ID":"fdf5ad25-dce9-4ead-b67f-785b499e5b9d","Type":"ContainerDied","Data":"124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98"} Oct 02 12:28:53 crc kubenswrapper[4835]: I1002 12:28:53.151562 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxbn8" event={"ID":"fdf5ad25-dce9-4ead-b67f-785b499e5b9d","Type":"ContainerStarted","Data":"6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc"} Oct 02 12:28:53 crc kubenswrapper[4835]: I1002 12:28:53.173554 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rxbn8" podStartSLOduration=4.331169293 podStartE2EDuration="7.173532859s" podCreationTimestamp="2025-10-02 12:28:46 +0000 UTC" firstStartedPulling="2025-10-02 12:28:49.103710853 +0000 UTC m=+5605.663618434" lastFinishedPulling="2025-10-02 12:28:51.946074429 +0000 UTC m=+5608.505982000" observedRunningTime="2025-10-02 12:28:53.167520567 +0000 UTC m=+5609.727428208" watchObservedRunningTime="2025-10-02 12:28:53.173532859 +0000 UTC m=+5609.733440440" Oct 02 12:28:57 crc kubenswrapper[4835]: I1002 12:28:57.353298 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:57 crc kubenswrapper[4835]: I1002 12:28:57.353817 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:57 crc kubenswrapper[4835]: I1002 12:28:57.421054 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:58 crc kubenswrapper[4835]: I1002 12:28:58.242886 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:28:58 crc kubenswrapper[4835]: I1002 12:28:58.291040 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxbn8"] Oct 02 12:28:59 crc kubenswrapper[4835]: I1002 12:28:59.577534 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-44wvb_bcd814c3-6b27-4cd7-a315-0dec8015d04f/control-plane-machine-set-operator/0.log" Oct 02 12:28:59 crc kubenswrapper[4835]: I1002 12:28:59.746815 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zshpt_26659604-60d4-488b-a38b-52fedf8d098d/kube-rbac-proxy/0.log" Oct 02 12:28:59 crc kubenswrapper[4835]: I1002 12:28:59.794168 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zshpt_26659604-60d4-488b-a38b-52fedf8d098d/machine-api-operator/0.log" Oct 02 12:29:00 crc kubenswrapper[4835]: I1002 12:29:00.227346 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-rxbn8" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerName="registry-server" containerID="cri-o://6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc" gracePeriod=2 Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.222939 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.239827 4835 generic.go:334] "Generic (PLEG): container finished" podID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerID="6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc" exitCode=0 Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.240038 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxbn8" event={"ID":"fdf5ad25-dce9-4ead-b67f-785b499e5b9d","Type":"ContainerDied","Data":"6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc"} Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.240145 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxbn8" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.240289 4835 scope.go:117] "RemoveContainer" containerID="6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.240154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxbn8" event={"ID":"fdf5ad25-dce9-4ead-b67f-785b499e5b9d","Type":"ContainerDied","Data":"e0b63b6fc8a0b93c91b8a44857d6e9e5d2b84c6a82851ed331a7284690a7f2a0"} Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.275791 4835 scope.go:117] "RemoveContainer" containerID="124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.307484 4835 scope.go:117] "RemoveContainer" containerID="c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.323268 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-catalog-content\") pod \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.323428 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-utilities\") pod \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.323483 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vbd2\" (UniqueName: \"kubernetes.io/projected/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-kube-api-access-6vbd2\") pod \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\" (UID: \"fdf5ad25-dce9-4ead-b67f-785b499e5b9d\") " Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.324821 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-utilities" (OuterVolumeSpecName: "utilities") pod "fdf5ad25-dce9-4ead-b67f-785b499e5b9d" (UID: "fdf5ad25-dce9-4ead-b67f-785b499e5b9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.332040 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-kube-api-access-6vbd2" (OuterVolumeSpecName: "kube-api-access-6vbd2") pod "fdf5ad25-dce9-4ead-b67f-785b499e5b9d" (UID: "fdf5ad25-dce9-4ead-b67f-785b499e5b9d"). InnerVolumeSpecName "kube-api-access-6vbd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.343019 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdf5ad25-dce9-4ead-b67f-785b499e5b9d" (UID: "fdf5ad25-dce9-4ead-b67f-785b499e5b9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.344097 4835 scope.go:117] "RemoveContainer" containerID="6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc" Oct 02 12:29:01 crc kubenswrapper[4835]: E1002 12:29:01.345964 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc\": container with ID starting with 6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc not found: ID does not exist" containerID="6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.346049 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc"} err="failed to get container status \"6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc\": rpc error: code = NotFound desc = could not find container \"6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc\": container with ID starting with 6c0fb9161437b146f7254031c9be3ace2d3271f5af8c13fb4e68dd1c7837b5dc not found: ID does not exist" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.346101 4835 scope.go:117] "RemoveContainer" containerID="124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98" Oct 02 12:29:01 crc kubenswrapper[4835]: E1002 12:29:01.347215 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98\": container with ID starting with 124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98 not found: ID does not exist" containerID="124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.347380 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98"} err="failed to get container status \"124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98\": rpc error: code = NotFound desc = could not find container \"124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98\": container with ID starting with 124fa04813a8b56c90d8a22927f9e83db460489795d91776cbda30721de0dc98 not found: ID does not exist" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.347494 4835 scope.go:117] "RemoveContainer" 
containerID="c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9" Oct 02 12:29:01 crc kubenswrapper[4835]: E1002 12:29:01.348035 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9\": container with ID starting with c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9 not found: ID does not exist" containerID="c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.348164 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9"} err="failed to get container status \"c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9\": rpc error: code = NotFound desc = could not find container \"c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9\": container with ID starting with c3da52ce9109c92b7cb1c88b5fa3a9075b6428b31df3800a7feae4753891ada9 not found: ID does not exist" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.427633 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.428121 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vbd2\" (UniqueName: \"kubernetes.io/projected/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-kube-api-access-6vbd2\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.428310 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf5ad25-dce9-4ead-b67f-785b499e5b9d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.576649 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxbn8"] Oct 02 12:29:01 crc kubenswrapper[4835]: I1002 12:29:01.585369 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxbn8"] Oct 02 12:29:02 crc kubenswrapper[4835]: I1002 12:29:02.266772 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" path="/var/lib/kubelet/pods/fdf5ad25-dce9-4ead-b67f-785b499e5b9d/volumes" Oct 02 12:29:11 crc kubenswrapper[4835]: I1002 12:29:11.056879 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-k2pjv_d962c289-233d-4788-a761-50ff86c59da8/cert-manager-controller/0.log" Oct 02 12:29:11 crc kubenswrapper[4835]: I1002 12:29:11.206535 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-zd9n5_818c5e73-52cb-47f0-84b4-1931dd17f6e8/cert-manager-cainjector/0.log" Oct 02 12:29:11 crc kubenswrapper[4835]: I1002 12:29:11.286724 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lppfz_dc8fd6c7-4ae9-49bf-b9ae-6e50882f670b/cert-manager-webhook/0.log" Oct 02 12:29:22 crc kubenswrapper[4835]: I1002 12:29:22.379596 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-5skfl_5a76ca7e-9356-424e-82ed-d184275f398e/nmstate-console-plugin/0.log" Oct 02 12:29:22 crc 
kubenswrapper[4835]: I1002 12:29:22.492710 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tpjx2_2322eea6-e397-4f14-ab1c-960128541d48/nmstate-handler/0.log" Oct 02 12:29:22 crc kubenswrapper[4835]: I1002 12:29:22.523611 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-r88ts_d5a0e56a-5f7e-4342-9571-3edcc1135a44/kube-rbac-proxy/0.log" Oct 02 12:29:22 crc kubenswrapper[4835]: I1002 12:29:22.552800 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-r88ts_d5a0e56a-5f7e-4342-9571-3edcc1135a44/nmstate-metrics/0.log" Oct 02 12:29:22 crc kubenswrapper[4835]: I1002 12:29:22.775578 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-nrg4w_6d38f1a8-cd4c-4e77-905b-0480f95167d6/nmstate-operator/0.log" Oct 02 12:29:22 crc kubenswrapper[4835]: I1002 12:29:22.783657 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-qgcrx_b384ef09-a91b-41f1-9cc5-35146749d375/nmstate-webhook/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.184863 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-gmhcv_f54912d4-2f6d-4c0d-947a-7f89783a8708/kube-rbac-proxy/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.303260 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-gmhcv_f54912d4-2f6d-4c0d-947a-7f89783a8708/controller/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.388930 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-frr-files/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.609451 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-frr-files/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.610482 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-metrics/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.611644 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-reloader/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.614261 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-reloader/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.806725 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-frr-files/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.829411 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-metrics/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.839991 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-reloader/0.log" Oct 02 12:29:36 crc kubenswrapper[4835]: I1002 12:29:36.855090 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-metrics/0.log" Oct 02 12:29:37 crc 
kubenswrapper[4835]: I1002 12:29:37.021726 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-frr-files/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.029067 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-metrics/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.031362 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/cp-reloader/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.071890 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/controller/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.254818 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/frr-metrics/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.257310 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/kube-rbac-proxy/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.333865 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/kube-rbac-proxy-frr/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.468816 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/reloader/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.622266 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-slt97_487c5efa-50e7-4182-a79b-b3848a8e1bd4/frr-k8s-webhook-server/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.778875 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cc456f76c-4fqdh_da4a91c1-ddf2-466a-99fd-dcc2be9dcb19/manager/0.log" Oct 02 12:29:37 crc kubenswrapper[4835]: I1002 12:29:37.949207 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-645c758f95-4lwwm_ce54775f-de78-454a-b9f0-6e17daec8861/webhook-server/0.log" Oct 02 12:29:38 crc kubenswrapper[4835]: I1002 12:29:38.130240 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-str4z_37346ff3-98f3-4dfc-b677-1f14e4b5a506/kube-rbac-proxy/0.log" Oct 02 12:29:38 crc kubenswrapper[4835]: I1002 12:29:38.663527 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-str4z_37346ff3-98f3-4dfc-b677-1f14e4b5a506/speaker/0.log" Oct 02 12:29:39 crc kubenswrapper[4835]: I1002 12:29:39.064378 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hdp2h_89a37e75-2e26-4eac-bd07-061b2cb7f0a9/frr/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.018539 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6_2e68d9dc-7ae9-44e2-be9e-88a18450e2db/util/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.285291 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6_2e68d9dc-7ae9-44e2-be9e-88a18450e2db/pull/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.291639 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6_2e68d9dc-7ae9-44e2-be9e-88a18450e2db/util/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.322484 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6_2e68d9dc-7ae9-44e2-be9e-88a18450e2db/pull/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.433114 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6_2e68d9dc-7ae9-44e2-be9e-88a18450e2db/util/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.501128 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6_2e68d9dc-7ae9-44e2-be9e-88a18450e2db/extract/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.514792 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2fnnb6_2e68d9dc-7ae9-44e2-be9e-88a18450e2db/pull/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.650438 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-76xgt_e50b1562-cfd0-4038-bb85-53e0c33838fc/extract-utilities/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.810006 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-76xgt_e50b1562-cfd0-4038-bb85-53e0c33838fc/extract-utilities/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.848631 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-76xgt_e50b1562-cfd0-4038-bb85-53e0c33838fc/extract-content/0.log" Oct 02 12:29:50 crc kubenswrapper[4835]: I1002 12:29:50.871404 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-76xgt_e50b1562-cfd0-4038-bb85-53e0c33838fc/extract-content/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.055358 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-76xgt_e50b1562-cfd0-4038-bb85-53e0c33838fc/extract-content/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.063164 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-76xgt_e50b1562-cfd0-4038-bb85-53e0c33838fc/extract-utilities/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.249796 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-76xgt_e50b1562-cfd0-4038-bb85-53e0c33838fc/registry-server/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.331450 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6blb_d544c135-38aa-4425-baf2-765c1f899617/extract-utilities/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.464960 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m6blb_d544c135-38aa-4425-baf2-765c1f899617/extract-utilities/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.492411 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6blb_d544c135-38aa-4425-baf2-765c1f899617/extract-content/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.498424 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6blb_d544c135-38aa-4425-baf2-765c1f899617/extract-content/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.656563 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6blb_d544c135-38aa-4425-baf2-765c1f899617/extract-utilities/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.692549 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6blb_d544c135-38aa-4425-baf2-765c1f899617/extract-content/0.log" Oct 02 12:29:51 crc kubenswrapper[4835]: I1002 12:29:51.923984 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m_e133539a-9cba-48c1-896a-fb04fd1b3c14/util/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.170922 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m_e133539a-9cba-48c1-896a-fb04fd1b3c14/util/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.184938 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m_e133539a-9cba-48c1-896a-fb04fd1b3c14/pull/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.222306 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m_e133539a-9cba-48c1-896a-fb04fd1b3c14/pull/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.455936 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m_e133539a-9cba-48c1-896a-fb04fd1b3c14/pull/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.475806 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m_e133539a-9cba-48c1-896a-fb04fd1b3c14/util/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.476874 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfq67m_e133539a-9cba-48c1-896a-fb04fd1b3c14/extract/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.680735 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6blb_d544c135-38aa-4425-baf2-765c1f899617/registry-server/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.748375 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-25xg8_a38861d2-5ab5-49ec-ac3e-1980fd30757a/marketplace-operator/0.log" Oct 02 12:29:52 crc kubenswrapper[4835]: I1002 12:29:52.889175 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-52mhn_d7cb39fe-2774-4e58-966a-78d55838e9f1/extract-utilities/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.050095 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52mhn_d7cb39fe-2774-4e58-966a-78d55838e9f1/extract-content/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.092247 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52mhn_d7cb39fe-2774-4e58-966a-78d55838e9f1/extract-utilities/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.114408 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52mhn_d7cb39fe-2774-4e58-966a-78d55838e9f1/extract-content/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.247191 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52mhn_d7cb39fe-2774-4e58-966a-78d55838e9f1/extract-utilities/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.302192 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52mhn_d7cb39fe-2774-4e58-966a-78d55838e9f1/extract-content/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.504826 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52mhn_d7cb39fe-2774-4e58-966a-78d55838e9f1/registry-server/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.523677 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9stw4_e5b7c17f-8144-41de-bd62-b8ec03c34fbf/extract-utilities/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.739281 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9stw4_e5b7c17f-8144-41de-bd62-b8ec03c34fbf/extract-content/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.740855 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9stw4_e5b7c17f-8144-41de-bd62-b8ec03c34fbf/extract-utilities/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.765928 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9stw4_e5b7c17f-8144-41de-bd62-b8ec03c34fbf/extract-content/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.926254 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9stw4_e5b7c17f-8144-41de-bd62-b8ec03c34fbf/extract-utilities/0.log" Oct 02 12:29:53 crc kubenswrapper[4835]: I1002 12:29:53.944260 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9stw4_e5b7c17f-8144-41de-bd62-b8ec03c34fbf/extract-content/0.log" Oct 02 12:29:54 crc kubenswrapper[4835]: I1002 12:29:54.711946 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9stw4_e5b7c17f-8144-41de-bd62-b8ec03c34fbf/registry-server/0.log" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.154126 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs"] Oct 02 12:30:00 crc kubenswrapper[4835]: E1002 12:30:00.155381 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" 
containerName="extract-content" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.155402 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerName="extract-content" Oct 02 12:30:00 crc kubenswrapper[4835]: E1002 12:30:00.155443 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerName="extract-utilities" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.155453 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerName="extract-utilities" Oct 02 12:30:00 crc kubenswrapper[4835]: E1002 12:30:00.155484 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerName="registry-server" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.155493 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerName="registry-server" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.155739 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf5ad25-dce9-4ead-b67f-785b499e5b9d" containerName="registry-server" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.156683 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.160108 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.160295 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.187479 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs"] Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.248438 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26d5c88d-f4e2-481d-b341-689f416a5517-secret-volume\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.248551 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg57l\" (UniqueName: \"kubernetes.io/projected/26d5c88d-f4e2-481d-b341-689f416a5517-kube-api-access-mg57l\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.248650 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26d5c88d-f4e2-481d-b341-689f416a5517-config-volume\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.349805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/26d5c88d-f4e2-481d-b341-689f416a5517-config-volume\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.350161 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26d5c88d-f4e2-481d-b341-689f416a5517-secret-volume\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.350275 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg57l\" (UniqueName: \"kubernetes.io/projected/26d5c88d-f4e2-481d-b341-689f416a5517-kube-api-access-mg57l\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.351457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26d5c88d-f4e2-481d-b341-689f416a5517-config-volume\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.361077 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26d5c88d-f4e2-481d-b341-689f416a5517-secret-volume\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.384396 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg57l\" (UniqueName: \"kubernetes.io/projected/26d5c88d-f4e2-481d-b341-689f416a5517-kube-api-access-mg57l\") pod \"collect-profiles-29323470-6rcvs\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:00 crc kubenswrapper[4835]: I1002 12:30:00.506784 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:01 crc kubenswrapper[4835]: I1002 12:30:01.026796 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs"] Oct 02 12:30:01 crc kubenswrapper[4835]: I1002 12:30:01.772725 4835 generic.go:334] "Generic (PLEG): container finished" podID="26d5c88d-f4e2-481d-b341-689f416a5517" containerID="74017047b288f8bc5a2cd4b944635b2862c29742b3d049c5310ffc52d95229d5" exitCode=0 Oct 02 12:30:01 crc kubenswrapper[4835]: I1002 12:30:01.772868 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" event={"ID":"26d5c88d-f4e2-481d-b341-689f416a5517","Type":"ContainerDied","Data":"74017047b288f8bc5a2cd4b944635b2862c29742b3d049c5310ffc52d95229d5"} Oct 02 12:30:01 crc kubenswrapper[4835]: I1002 12:30:01.773063 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" event={"ID":"26d5c88d-f4e2-481d-b341-689f416a5517","Type":"ContainerStarted","Data":"d7b289d9c4f5ed726d5d8525dba3728caca748489f0cd3c832d09283fa0a3f54"} Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.222540 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.317527 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg57l\" (UniqueName: \"kubernetes.io/projected/26d5c88d-f4e2-481d-b341-689f416a5517-kube-api-access-mg57l\") pod \"26d5c88d-f4e2-481d-b341-689f416a5517\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.317570 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26d5c88d-f4e2-481d-b341-689f416a5517-config-volume\") pod \"26d5c88d-f4e2-481d-b341-689f416a5517\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.317643 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26d5c88d-f4e2-481d-b341-689f416a5517-secret-volume\") pod \"26d5c88d-f4e2-481d-b341-689f416a5517\" (UID: \"26d5c88d-f4e2-481d-b341-689f416a5517\") " Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.319902 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d5c88d-f4e2-481d-b341-689f416a5517-config-volume" (OuterVolumeSpecName: "config-volume") pod "26d5c88d-f4e2-481d-b341-689f416a5517" (UID: "26d5c88d-f4e2-481d-b341-689f416a5517"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.326115 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d5c88d-f4e2-481d-b341-689f416a5517-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "26d5c88d-f4e2-481d-b341-689f416a5517" (UID: "26d5c88d-f4e2-481d-b341-689f416a5517"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.326126 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d5c88d-f4e2-481d-b341-689f416a5517-kube-api-access-mg57l" (OuterVolumeSpecName: "kube-api-access-mg57l") pod "26d5c88d-f4e2-481d-b341-689f416a5517" (UID: "26d5c88d-f4e2-481d-b341-689f416a5517"). InnerVolumeSpecName "kube-api-access-mg57l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.420541 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg57l\" (UniqueName: \"kubernetes.io/projected/26d5c88d-f4e2-481d-b341-689f416a5517-kube-api-access-mg57l\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.420605 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26d5c88d-f4e2-481d-b341-689f416a5517-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.420625 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26d5c88d-f4e2-481d-b341-689f416a5517-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.803137 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" event={"ID":"26d5c88d-f4e2-481d-b341-689f416a5517","Type":"ContainerDied","Data":"d7b289d9c4f5ed726d5d8525dba3728caca748489f0cd3c832d09283fa0a3f54"} Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.803192 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b289d9c4f5ed726d5d8525dba3728caca748489f0cd3c832d09283fa0a3f54" Oct 02 12:30:03 crc kubenswrapper[4835]: I1002 12:30:03.803200 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-6rcvs" Oct 02 12:30:04 crc kubenswrapper[4835]: I1002 12:30:04.305310 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4"] Oct 02 12:30:04 crc kubenswrapper[4835]: I1002 12:30:04.313100 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-jdvr4"] Oct 02 12:30:06 crc kubenswrapper[4835]: I1002 12:30:06.268996 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491d8503-b7e0-41af-9fce-7ea8f1344ff3" path="/var/lib/kubelet/pods/491d8503-b7e0-41af-9fce-7ea8f1344ff3/volumes" Oct 02 12:31:02 crc kubenswrapper[4835]: I1002 12:31:02.704523 4835 scope.go:117] "RemoveContainer" containerID="e4009de0d456aef2eb5d606411b04231830086bb6a0060a19f11b663a8faa58a" Oct 02 12:31:11 crc kubenswrapper[4835]: I1002 12:31:11.983906 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:31:11 crc kubenswrapper[4835]: I1002 12:31:11.984361 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:31:41 crc kubenswrapper[4835]: I1002 12:31:41.984468 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:31:41 crc kubenswrapper[4835]: I1002 12:31:41.985048 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:32:11 crc kubenswrapper[4835]: I1002 12:32:11.984280 4835 patch_prober.go:28] interesting pod/machine-config-daemon-5ckb9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:32:11 crc kubenswrapper[4835]: I1002 12:32:11.984729 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:32:11 crc kubenswrapper[4835]: I1002 12:32:11.984788 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" Oct 02 12:32:11 crc kubenswrapper[4835]: I1002 12:32:11.985606 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fd5656421c8d6ecb2c1780bcbf4ce2378ab8835bd3a3320e051dd565eb10b4b8"} pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:32:11 crc kubenswrapper[4835]: I1002 12:32:11.985661 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" podUID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerName="machine-config-daemon" containerID="cri-o://fd5656421c8d6ecb2c1780bcbf4ce2378ab8835bd3a3320e051dd565eb10b4b8" gracePeriod=600 Oct 02 12:32:13 crc kubenswrapper[4835]: I1002 12:32:13.004292 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce0ad186-63b7-432a-a0ca-4d4cbde057a8" containerID="fd5656421c8d6ecb2c1780bcbf4ce2378ab8835bd3a3320e051dd565eb10b4b8" exitCode=0 Oct 02 12:32:13 crc kubenswrapper[4835]: I1002 12:32:13.004430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerDied","Data":"fd5656421c8d6ecb2c1780bcbf4ce2378ab8835bd3a3320e051dd565eb10b4b8"} Oct 02 12:32:13 crc kubenswrapper[4835]: I1002 12:32:13.004891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5ckb9" event={"ID":"ce0ad186-63b7-432a-a0ca-4d4cbde057a8","Type":"ContainerStarted","Data":"43fe32ad17f452f4e7127f564d36fabf296534f6c21c0de53ac364b212ecc367"} Oct 02 12:32:13 crc kubenswrapper[4835]: I1002 12:32:13.004914 4835 scope.go:117] "RemoveContainer" containerID="dc832ac38b9d1137d3f5c041b166c4ec0b01d4beaaa5c986b93cf2cb39ad2a9a" Oct 02 12:32:21 crc kubenswrapper[4835]: I1002 12:32:21.095882 4835 generic.go:334] "Generic (PLEG): container finished" podID="9d5a0b00-da61-493c-a758-56d420fe4971" containerID="7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b" exitCode=0 Oct 02 12:32:21 crc kubenswrapper[4835]: I1002 12:32:21.095957 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnfs2/must-gather-g9662" event={"ID":"9d5a0b00-da61-493c-a758-56d420fe4971","Type":"ContainerDied","Data":"7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b"} Oct 02 12:32:21 crc kubenswrapper[4835]: I1002 12:32:21.097205 4835 scope.go:117] "RemoveContainer" containerID="7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b" Oct 02 12:32:21 crc kubenswrapper[4835]: I1002 12:32:21.380797 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nnfs2_must-gather-g9662_9d5a0b00-da61-493c-a758-56d420fe4971/gather/0.log" Oct 02 12:32:30 crc kubenswrapper[4835]: I1002 12:32:30.456445 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nnfs2/must-gather-g9662"] Oct 02 12:32:30 crc kubenswrapper[4835]: I1002 12:32:30.457138 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nnfs2/must-gather-g9662" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" containerName="copy" containerID="cri-o://9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1" gracePeriod=2 Oct 02 12:32:30 crc kubenswrapper[4835]: I1002 12:32:30.466182 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nnfs2/must-gather-g9662"] Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.020075 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-nnfs2_must-gather-g9662_9d5a0b00-da61-493c-a758-56d420fe4971/copy/0.log" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.029935 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.125917 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5a0b00-da61-493c-a758-56d420fe4971-must-gather-output\") pod \"9d5a0b00-da61-493c-a758-56d420fe4971\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.126018 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpfhh\" (UniqueName: \"kubernetes.io/projected/9d5a0b00-da61-493c-a758-56d420fe4971-kube-api-access-fpfhh\") pod \"9d5a0b00-da61-493c-a758-56d420fe4971\" (UID: \"9d5a0b00-da61-493c-a758-56d420fe4971\") " Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.133349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5a0b00-da61-493c-a758-56d420fe4971-kube-api-access-fpfhh" (OuterVolumeSpecName: "kube-api-access-fpfhh") pod "9d5a0b00-da61-493c-a758-56d420fe4971" (UID: "9d5a0b00-da61-493c-a758-56d420fe4971"). InnerVolumeSpecName "kube-api-access-fpfhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.206274 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nnfs2_must-gather-g9662_9d5a0b00-da61-493c-a758-56d420fe4971/copy/0.log" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.208861 4835 generic.go:334] "Generic (PLEG): container finished" podID="9d5a0b00-da61-493c-a758-56d420fe4971" containerID="9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1" exitCode=143 Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.208932 4835 scope.go:117] "RemoveContainer" containerID="9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.208996 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nnfs2/must-gather-g9662" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.229085 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpfhh\" (UniqueName: \"kubernetes.io/projected/9d5a0b00-da61-493c-a758-56d420fe4971-kube-api-access-fpfhh\") on node \"crc\" DevicePath \"\"" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.246339 4835 scope.go:117] "RemoveContainer" containerID="7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.356260 4835 scope.go:117] "RemoveContainer" containerID="9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1" Oct 02 12:32:31 crc kubenswrapper[4835]: E1002 12:32:31.357487 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1\": container with ID starting with 9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1 not found: ID does not exist" containerID="9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.357564 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1"} err="failed to get container status \"9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1\": rpc error: code = NotFound desc = could not find container \"9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1\": container with ID starting with 9b9faec8c5783b21d1f7a31a345a58717fdb4c03c537cd18179514396ab2f4f1 not found: ID does not exist" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.357602 4835 scope.go:117] "RemoveContainer" containerID="7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b" Oct 02 12:32:31 crc kubenswrapper[4835]: E1002 12:32:31.358409 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b\": container with ID starting with 7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b not found: ID does not exist" containerID="7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.358448 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b"} err="failed to get container status \"7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b\": rpc error: code = NotFound desc = could not find container \"7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b\": container with ID starting with 7c23b45f9231dcdbd53c901174b0f16286b548e4fa883608e68969ca7c7a9e8b not found: ID does not exist" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.367050 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d5a0b00-da61-493c-a758-56d420fe4971-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9d5a0b00-da61-493c-a758-56d420fe4971" (UID: "9d5a0b00-da61-493c-a758-56d420fe4971"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:32:31 crc kubenswrapper[4835]: I1002 12:32:31.434034 4835 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5a0b00-da61-493c-a758-56d420fe4971-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 12:32:32 crc kubenswrapper[4835]: I1002 12:32:32.261791 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" path="/var/lib/kubelet/pods/9d5a0b00-da61-493c-a758-56d420fe4971/volumes" Oct 02 12:33:02 crc kubenswrapper[4835]: I1002 12:33:02.782283 4835 scope.go:117] "RemoveContainer" containerID="d878f12789396ba88162a95016421042b8f371741f260b99ac2c492439f50eff" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.976964 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9xg5l"] Oct 02 12:33:47 crc kubenswrapper[4835]: E1002 12:33:47.980838 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d5c88d-f4e2-481d-b341-689f416a5517" containerName="collect-profiles" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.980864 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d5c88d-f4e2-481d-b341-689f416a5517" containerName="collect-profiles" Oct 02 12:33:47 crc kubenswrapper[4835]: E1002 12:33:47.980899 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" containerName="copy" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.980909 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" containerName="copy" Oct 02 12:33:47 crc kubenswrapper[4835]: E1002 12:33:47.980925 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" containerName="gather" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.980935 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" containerName="gather" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.981190 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" containerName="gather" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.981246 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d5c88d-f4e2-481d-b341-689f416a5517" containerName="collect-profiles" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.981267 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d5a0b00-da61-493c-a758-56d420fe4971" containerName="copy" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.983271 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:47 crc kubenswrapper[4835]: I1002 12:33:47.990686 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xg5l"] Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.114933 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-utilities\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.115335 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-catalog-content\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.115542 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt5s8\" (UniqueName: \"kubernetes.io/projected/c65a8cd5-858e-4934-9de1-ec7789e3d43b-kube-api-access-jt5s8\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.217351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-catalog-content\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.217460 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt5s8\" (UniqueName: \"kubernetes.io/projected/c65a8cd5-858e-4934-9de1-ec7789e3d43b-kube-api-access-jt5s8\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.217600 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-utilities\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.217967 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-catalog-content\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.217984 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-utilities\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.241776 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jt5s8\" (UniqueName: \"kubernetes.io/projected/c65a8cd5-858e-4934-9de1-ec7789e3d43b-kube-api-access-jt5s8\") pod \"community-operators-9xg5l\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") " pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.310190 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xg5l" Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.863408 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xg5l"] Oct 02 12:33:48 crc kubenswrapper[4835]: I1002 12:33:48.901712 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xg5l" event={"ID":"c65a8cd5-858e-4934-9de1-ec7789e3d43b","Type":"ContainerStarted","Data":"8d1ee0a1b95ebc9923d99c35b30399b688299e37804f700cee758df888e75c58"} Oct 02 12:33:49 crc kubenswrapper[4835]: I1002 12:33:49.912952 4835 generic.go:334] "Generic (PLEG): container finished" podID="c65a8cd5-858e-4934-9de1-ec7789e3d43b" containerID="68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458" exitCode=0 Oct 02 12:33:49 crc kubenswrapper[4835]: I1002 12:33:49.913050 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xg5l" event={"ID":"c65a8cd5-858e-4934-9de1-ec7789e3d43b","Type":"ContainerDied","Data":"68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458"} Oct 02 12:33:49 crc kubenswrapper[4835]: I1002 12:33:49.917278 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:33:52 crc kubenswrapper[4835]: I1002 12:33:52.947599 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xg5l" event={"ID":"c65a8cd5-858e-4934-9de1-ec7789e3d43b","Type":"ContainerStarted","Data":"386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5"} Oct 02 12:33:54 crc kubenswrapper[4835]: I1002 12:33:54.977829 4835 generic.go:334] "Generic (PLEG): container finished" podID="c65a8cd5-858e-4934-9de1-ec7789e3d43b" containerID="386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5" exitCode=0 Oct 02 12:33:54 crc kubenswrapper[4835]: I1002 12:33:54.977901 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xg5l" event={"ID":"c65a8cd5-858e-4934-9de1-ec7789e3d43b","Type":"ContainerDied","Data":"386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5"} Oct 02 12:33:55 crc kubenswrapper[4835]: I1002 12:33:55.991246 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xg5l" event={"ID":"c65a8cd5-858e-4934-9de1-ec7789e3d43b","Type":"ContainerStarted","Data":"aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1"} Oct 02 12:33:56 crc kubenswrapper[4835]: I1002 12:33:56.021205 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9xg5l" podStartSLOduration=3.4855043 podStartE2EDuration="9.021182031s" podCreationTimestamp="2025-10-02 12:33:47 +0000 UTC" firstStartedPulling="2025-10-02 12:33:49.91694433 +0000 UTC m=+5906.476851931" lastFinishedPulling="2025-10-02 12:33:55.452622081 +0000 UTC m=+5912.012529662" observedRunningTime="2025-10-02 12:33:56.01558974 +0000 UTC m=+5912.575497351" watchObservedRunningTime="2025-10-02 
12:33:56.021182031 +0000 UTC m=+5912.581089612"
Oct 02 12:33:58 crc kubenswrapper[4835]: I1002 12:33:58.311079 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9xg5l"
Oct 02 12:33:58 crc kubenswrapper[4835]: I1002 12:33:58.312047 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9xg5l"
Oct 02 12:33:58 crc kubenswrapper[4835]: I1002 12:33:58.354513 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9xg5l"
Oct 02 12:34:08 crc kubenswrapper[4835]: I1002 12:34:08.373811 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9xg5l"
Oct 02 12:34:08 crc kubenswrapper[4835]: I1002 12:34:08.426313 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xg5l"]
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.110358 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9xg5l" podUID="c65a8cd5-858e-4934-9de1-ec7789e3d43b" containerName="registry-server" containerID="cri-o://aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1" gracePeriod=2
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.587934 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xg5l"
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.648351 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-utilities\") pod \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") "
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.648414 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-catalog-content\") pod \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") "
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.648515 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt5s8\" (UniqueName: \"kubernetes.io/projected/c65a8cd5-858e-4934-9de1-ec7789e3d43b-kube-api-access-jt5s8\") pod \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\" (UID: \"c65a8cd5-858e-4934-9de1-ec7789e3d43b\") "
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.650178 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-utilities" (OuterVolumeSpecName: "utilities") pod "c65a8cd5-858e-4934-9de1-ec7789e3d43b" (UID: "c65a8cd5-858e-4934-9de1-ec7789e3d43b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.655813 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65a8cd5-858e-4934-9de1-ec7789e3d43b-kube-api-access-jt5s8" (OuterVolumeSpecName: "kube-api-access-jt5s8") pod "c65a8cd5-858e-4934-9de1-ec7789e3d43b" (UID: "c65a8cd5-858e-4934-9de1-ec7789e3d43b"). InnerVolumeSpecName "kube-api-access-jt5s8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.704660 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c65a8cd5-858e-4934-9de1-ec7789e3d43b" (UID: "c65a8cd5-858e-4934-9de1-ec7789e3d43b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.751209 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.751282 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a8cd5-858e-4934-9de1-ec7789e3d43b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:34:09 crc kubenswrapper[4835]: I1002 12:34:09.751299 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt5s8\" (UniqueName: \"kubernetes.io/projected/c65a8cd5-858e-4934-9de1-ec7789e3d43b-kube-api-access-jt5s8\") on node \"crc\" DevicePath \"\""
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.131118 4835 generic.go:334] "Generic (PLEG): container finished" podID="c65a8cd5-858e-4934-9de1-ec7789e3d43b" containerID="aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1" exitCode=0
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.131179 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xg5l" event={"ID":"c65a8cd5-858e-4934-9de1-ec7789e3d43b","Type":"ContainerDied","Data":"aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1"}
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.131234 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xg5l" event={"ID":"c65a8cd5-858e-4934-9de1-ec7789e3d43b","Type":"ContainerDied","Data":"8d1ee0a1b95ebc9923d99c35b30399b688299e37804f700cee758df888e75c58"}
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.131265 4835 scope.go:117] "RemoveContainer" containerID="aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.131515 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xg5l"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.157529 4835 scope.go:117] "RemoveContainer" containerID="386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.179705 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xg5l"]
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.188743 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9xg5l"]
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.190722 4835 scope.go:117] "RemoveContainer" containerID="68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.230602 4835 scope.go:117] "RemoveContainer" containerID="aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1"
Oct 02 12:34:10 crc kubenswrapper[4835]: E1002 12:34:10.231174 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1\": container with ID starting with aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1 not found: ID does not exist" containerID="aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.231238 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1"} err="failed to get container status \"aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1\": rpc error: code = NotFound desc = could not find container \"aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1\": container with ID starting with aa351584852fcce14395df7cc8b7800f9cfd942d7beadb68692d1047337d4de1 not found: ID does not exist"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.231278 4835 scope.go:117] "RemoveContainer" containerID="386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5"
Oct 02 12:34:10 crc kubenswrapper[4835]: E1002 12:34:10.231743 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5\": container with ID starting with 386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5 not found: ID does not exist" containerID="386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.231822 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5"} err="failed to get container status \"386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5\": rpc error: code = NotFound desc = could not find container \"386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5\": container with ID starting with 386972db7b7497fb96f2e198d318324660f5421217b6678a2a6e0fc00a174ed5 not found: ID does not exist"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.231865 4835 scope.go:117] "RemoveContainer" containerID="68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458"
Oct 02 12:34:10 crc kubenswrapper[4835]: E1002 12:34:10.232504 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458\": container with ID starting with 68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458 not found: ID does not exist" containerID="68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.232733 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458"} err="failed to get container status \"68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458\": rpc error: code = NotFound desc = could not find container \"68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458\": container with ID starting with 68dba66e984c8100e569d1e72c954c4a96068758d3d304966a51a19489332458 not found: ID does not exist"
Oct 02 12:34:10 crc kubenswrapper[4835]: I1002 12:34:10.265070 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65a8cd5-858e-4934-9de1-ec7789e3d43b" path="/var/lib/kubelet/pods/c65a8cd5-858e-4934-9de1-ec7789e3d43b/volumes"